Prompt

At a glance

The post describes a ValidationError raised when constructing a HuggingFaceLLM object. The community member traces the error to the query_wrapper_prompt parameter: the same code has worked with their fine-tuned Llama 2 model before, but fails today.

In the comments, another community member suggests that the expected type of the query_wrapper_prompt parameter may have changed, and recommends passing a plain string instead of wrapping it in a SimpleInputPrompt.

ValidationError Traceback (most recent call last)
Cell In[17], line 20
9 SYSTEM_PROMPT = """You are an AI assistant that answers questions in a friendly manner, based on the given source documents. Here are some rules you always follow:
10 - Generate human readable output, avoid creating output with gibberish text.
11 - Generate only the requested output, don't include any other language before or after the requested output.
12 - Never say thank you, that you are happy to help, that you are an AI agent, etc. Just answer directly.
13 - Generate professional language typically used in business documents in North America.
14 """
16 query_wrapper_prompt = SimpleInputPrompt(
17 "[INST]<<SYS>>\n" + SYSTEM_PROMPT + "<</SYS>>\n\n{query_str}[/INST]"
18 )
---> 20 llm = HuggingFaceLLM(
21 context_window=4096,
22 max_new_tokens=2048,
23 generate_kwargs={"temperature": 0.0, "do_sample": False},
24 query_wrapper_prompt=query_wrapper_prompt,
25 tokenizer_name=selected_model,
26 model_name=selected_model,
27 device_map="auto",
28 # change these settings below depending on your GPU
29 model_kwargs={"torch_dtype": torch.float},
30 )

File /usr/local/lib/python3.10/dist-packages/llama_index/llms/huggingface.py:150, in HuggingFaceLLM.__init__(self, context_window, max_new_tokens, system_prompt, query_wrapper_prompt, tokenizer_name, model_name, model, tokenizer, device_map, stopping_ids, tokenizer_kwargs, tokenizer_outputs_to_remove, model_kwargs, generate_kwargs, callback_manager)
146 return False
148 self._stopping_criteria = StoppingCriteriaList([StopOnTokens()])
--> 150 super().__init__(
151 context_window=context_window,
152 max_new_tokens=max_new_tokens,
153 system_prompt=system_prompt,
154 query_wrapper_prompt=query_wrapper_prompt,
155 tokenizer_name=tokenizer_name,
156 model_name=model_name,
157 device_map=device_map,
158 stopping_ids=stopping_ids or [],
159 tokenizer_kwargs=tokenizer_kwargs or {},
160 tokenizer_outputs_to_remove=tokenizer_outputs_to_remove or [],
161 model_kwargs=model_kwargs or {},
162 generate_kwargs=generate_kwargs or {},
163 callback_manager=callback_manager,
164 )

File /usr/local/lib/python3.10/dist-packages/pydantic/main.py:341, in pydantic.main.BaseModel.__init__()

ValidationError: 1 validation error for HuggingFaceLLM
query_wrapper_prompt

I'm using a Llama 2 model that I fine-tuned. This usually works, but today it doesn't.
2 comments
I think the type changed for query_wrapper_prompt.

Use a string instead of wrapping it with SimpleInputPrompt.
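
For reference, here is a minimal sketch of that suggestion, not a verified fix: the original constructor call with query_wrapper_prompt passed as a plain format string. It assumes a llama_index version where the parameter accepts a string, and selected_model is a placeholder for the fine-tuned Llama 2 checkpoint used in the original snippet.

# Sketch of the suggested fix: pass query_wrapper_prompt as a plain string
# instead of a SimpleInputPrompt object. Assumes a llama_index version where
# the parameter is typed as a string; selected_model is a placeholder.
import torch
from llama_index.llms import HuggingFaceLLM

selected_model = "meta-llama/Llama-2-7b-chat-hf"  # placeholder model id

SYSTEM_PROMPT = """You are an AI assistant that answers questions in a friendly manner, based on the given source documents."""

# Plain string with a {query_str} placeholder -- no SimpleInputPrompt wrapper.
query_wrapper_prompt = (
    "[INST]<<SYS>>\n" + SYSTEM_PROMPT + "<</SYS>>\n\n{query_str}[/INST]"
)

llm = HuggingFaceLLM(
    context_window=4096,
    max_new_tokens=2048,
    generate_kwargs={"temperature": 0.0, "do_sample": False},
    query_wrapper_prompt=query_wrapper_prompt,  # string, not SimpleInputPrompt
    tokenizer_name=selected_model,
    model_name=selected_model,
    device_map="auto",
    model_kwargs={"torch_dtype": torch.float},
)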