The post describes an issue with a HuggingFaceLLM object, where a ValidationError is raised. The community member says the problem involves the query_wrapper_prompt parameter, which they have configured successfully before but which fails today.
In the comments, another community member suggests that the type of the query_wrapper_prompt parameter may have changed, and recommends using a string instead of wrapping it with a SimpleInputPrompt.
ValidationError                           Traceback (most recent call last)
Cell In[17], line 20
      9 SYSTEM_PROMPT = """You are an AI assistant that answers questions in a friendly manner, based on the given source documents. Here are some rules you always follow:
     10 - Generate human readable output, avoid creating output with gibberish text.
     11 - Generate only the requested output, don't include any other language before or after the requested output.
     12 - Never say thank you, that you are happy to help, that you are an AI agent, etc. Just answer directly.
     13 - Generate professional language typically used in business documents in North America.
     14 """
     16 query_wrapper_prompt = SimpleInputPrompt(
     17     "[INST]<<SYS>>\n" + SYSTEM_PROMPT + "<</SYS>>\n\n{query_str}[/INST]"
     18 )
---> 20 llm = HuggingFaceLLM(
     21     context_window=4096,
     22     max_new_tokens=2048,
     23     generate_kwargs={"temperature": 0.0, "do_sample": False},
     24     query_wrapper_prompt=query_wrapper_prompt,
     25     tokenizer_name=selected_model,
     26     model_name=selected_model,
     27     device_map="auto",
     28     # change these settings below depending on your GPU
     29     model_kwargs={"torch_dtype": torch.float},
     30 )
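Following the suggestion in the comments, a minimal sketch of the fix is to build the query wrapper as a plain string rather than wrapping it in SimpleInputPrompt, and pass that string to HuggingFaceLLM. This assumes the installed llama_index version accepts a string for query_wrapper_prompt, as the comment indicates; the HuggingFaceLLM call itself is shown commented out since it requires the model weights and the original notebook's selected_model variable.

```python
# Suggested fix (assumption from the thread): pass query_wrapper_prompt as a
# plain string instead of a SimpleInputPrompt. The {query_str} placeholder is
# substituted with the user's query at request time by the LLM wrapper.
SYSTEM_PROMPT = """You are an AI assistant that answers questions in a friendly manner, based on the given source documents. Here are some rules you always follow:
- Generate human readable output, avoid creating output with gibberish text.
- Generate only the requested output, don't include any other language before or after the requested output.
- Never say thank you, that you are happy to help, that you are an AI agent, etc. Just answer directly.
- Generate professional language typically used in business documents in North America.
"""

# Llama-2-style instruction wrapper, now a raw format string:
query_wrapper_prompt = (
    "[INST]<<SYS>>\n" + SYSTEM_PROMPT + "<</SYS>>\n\n{query_str}[/INST]"
)

# llm = HuggingFaceLLM(
#     context_window=4096,
#     max_new_tokens=2048,
#     generate_kwargs={"temperature": 0.0, "do_sample": False},
#     query_wrapper_prompt=query_wrapper_prompt,  # string, not SimpleInputPrompt
#     tokenizer_name=selected_model,
#     model_name=selected_model,
#     device_map="auto",
#     model_kwargs={"torch_dtype": torch.float},
# )
```

The rest of the constructor arguments are unchanged; only the type of query_wrapper_prompt differs from the failing code.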