----------------------
The error message you're seeing indicates that the `query_wrapper_prompt` parameter expects a string, but it's receiving an instance of `SimpleInputPrompt`. In the examples provided in the extracts, `query_wrapper_prompt` is initialized as an instance of `SimpleInputPrompt`, and that instance is then passed directly to the `HuggingFaceLLM` constructor.
Here's an example from the extracts:

```python
from llama_index.prompts.prompts import SimpleInputPrompt

query_wrapper_prompt = SimpleInputPrompt("<|USER|>{query_str}<|ASSISTANT|>")
```

And then it's used in the `HuggingFaceLLM` constructor:

```python
llm = HuggingFaceLLM(
    ...
    query_wrapper_prompt=query_wrapper_prompt,
    ...
)
```
It seems there may be a discrepancy between the way you're using `query_wrapper_prompt` and the way it's used in the examples. You might try passing `query_wrapper_prompt` as a plain string to the `HuggingFaceLLM` constructor, like so:

```python
query_wrapper_prompt = "<|USER|>{query_str}<|ASSISTANT|>"

llm = HuggingFaceLLM(
    ...
    query_wrapper_prompt=query_wrapper_prompt,
    ...
)
```
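For context, the wrapper string is just an ordinary Python format template: the `{query_str}` placeholder gets filled in with the user's query before it is sent to the model. A minimal sketch of that substitution, independent of LlamaIndex (the sample query here is made up for illustration):

```python
# The query wrapper is a plain format template; {query_str} is the
# placeholder that gets replaced with the user's actual query.
query_wrapper_prompt = "<|USER|>{query_str}<|ASSISTANT|>"

# Simulate what happens when a query is wrapped before hitting the model.
wrapped = query_wrapper_prompt.format(query_str="What is LlamaIndex?")
print(wrapped)  # <|USER|>What is LlamaIndex?<|ASSISTANT|>
```

This is why a plain string works here: the constructor only needs something it can format with `query_str`.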
If this doesn't resolve your issue, I would recommend checking the LlamaIndex documentation for more information on how to use the `query_wrapper_prompt` parameter.
----------------------