
@kapa.ai this is my query wrapper: query_wrapper_prompt = SimpleInputPrompt("<|USER|>{query_str}<|ASSISTANT|>")

but I am getting this error:
---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Cell In[1], line 25
     23 qa_template = Prompt(template)
     24 # Initialize HuggingFaceLLM
---> 25 llm = HuggingFaceLLM(
     26     context_window=1028,
     27     max_new_tokens=256,
     28     generate_kwargs={"temperature": 0.7, "do_sample": False},
     29     system_prompt=system_prompt,
     30     query_wrapper_prompt=query_wrapper_prompt,
     31     tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
     32     model_name="StabilityAI/stablelm-tuned-alpha-3b",
     33     device_map="cuda",
     34     stopping_ids=[50278, 50279, 50277, 1, 0],
     35     tokenizer_kwargs={"max_length": 1028},
     36     # uncomment this if using CUDA to reduce memory usage
     37     model_kwargs={"torch_dtype": torch.float16}
     38 )

File /opt/conda/envs/sagemaker-distribution/lib/python3.8/site-packages/pydantic/main.py:341, in pydantic.main.BaseModel.__init__()

ValidationError: 1 validation error for HuggingFaceLLM
query_wrapper_prompt
  str type expected (type=type_error.str)
----------------------
The error message indicates that the query_wrapper_prompt parameter expects a string, but it's receiving an instance of SimpleInputPrompt.
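
To see why pydantic complains, here is a minimal, self-contained sketch. FakeLLM and the bare-bones SimpleInputPrompt below are hypothetical stand-ins, not LlamaIndex's real classes; the point is just that a pydantic v1 field declared as str (which is what your traceback shows) rejects any non-string object:

Python
from pydantic import BaseModel, ValidationError

class SimpleInputPrompt:  # hypothetical stand-in for llama_index's prompt class
    def __init__(self, template: str):
        self.template = template

class FakeLLM(BaseModel):  # hypothetical stand-in for HuggingFaceLLM
    query_wrapper_prompt: str  # declared as str, so pydantic validates the input

try:
    FakeLLM(query_wrapper_prompt=SimpleInputPrompt("<|USER|>{query_str}<|ASSISTANT|>"))
except ValidationError as err:
    print(err)
    # 1 validation error for FakeLLM
    # query_wrapper_prompt
    #   str type expected (type=type_error.str)  <- pydantic v1 wording, as in your traceback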

In the examples provided in the extracts, query_wrapper_prompt is initialized as an instance of SimpleInputPrompt and passed straight to the HuggingFaceLLM constructor, yet your traceback shows the constructor validating that argument as a string.

Here's an example from the extracts:

Python
from llama_index.prompts.prompts import SimpleInputPrompt

query_wrapper_prompt = SimpleInputPrompt("<|USER|>{query_str}<|ASSISTANT|>")


And then it's used in the HuggingFaceLLM constructor:

Python
llm = HuggingFaceLLM(
    ...
    query_wrapper_prompt=query_wrapper_prompt,
    ...
)


So there seems to be a discrepancy between those examples and what your installed version of HuggingFaceLLM will accept for query_wrapper_prompt.

You might want to try passing query_wrapper_prompt as a plain string instead when constructing HuggingFaceLLM, like so:

Python
query_wrapper_prompt = "<|USER|>{query_str}<|ASSISTANT|>"

llm = HuggingFaceLLM(
    ...
    query_wrapper_prompt=query_wrapper_prompt,
    ...
)
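
A quick sanity check that the plain string still works as a template: it keeps the {query_str} placeholder, so ordinary Python str.format can fill it in (plain Python below, not LlamaIndex code):

Python
# The string template still carries the {query_str} slot;
# formatting it yields the wrapped prompt the model expects.
query_wrapper_prompt = "<|USER|>{query_str}<|ASSISTANT|>"
print(query_wrapper_prompt.format(query_str="What is a vector index?"))
# -> <|USER|>What is a vector index?<|ASSISTANT|>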


If this doesn't resolve your issue, I would recommend checking the LlamaIndex documentation for more information on how to use the query_wrapper_prompt parameter.
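
Since the examples and your installed version appear to disagree about the expected type, it's also worth confirming exactly which release you're on before reading the docs. A quick check with the standard library (assuming you installed the PyPI package llama-index):

Python
import importlib.metadata

# Look up the installed distribution's version string (stdlib, Python 3.8+).
print(importlib.metadata.version("llama-index"))
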
----------------------