I am facing the following error when using cuda as a device map:
---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Cell In[1], line 25
     23 qa_template = Prompt(template)
     24 # Initialize HuggingFaceLLM
---> 25 llm = HuggingFaceLLM(
     26     context_window=1028,
     27     max_new_tokens=256,
     28     generate_kwargs={"temperature": 0.7, "do_sample": False},
     29     system_prompt=system_prompt,
     30     query_wrapper_prompt=query_wrapper_prompt,
     31     tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
     32     model_name="StabilityAI/stablelm-tuned-alpha-3b",
     33     device_map="cuda",
     34     stopping_ids=[50278, 50279, 50277, 1, 0],
     35     tokenizer_kwargs={"max_length": 1028},
     36     # uncomment this if using CUDA to reduce memory usage
     37     model_kwargs={"torch_dtype": torch.float16}
     38 )

File /opt/conda/envs/sagemaker-distribution/lib/python3.8/site-packages/pydantic/main.py:341, in pydantic.main.BaseModel.__init__()

ValidationError: 1 validation error for HuggingFaceLLM
query_wrapper_prompt
  str type expected (type=type_error.str)
Please stand by @Ftm as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
The error message you're seeing is a ValidationError from Pydantic, which is a data validation library used in Python. This error is being raised when you're trying to initialize the HuggingFaceLLM class. Specifically, it's complaining about the query_wrapper_prompt argument.

The error message str type expected (type=type_error.str) suggests that the query_wrapper_prompt argument is expecting a string type, but it's receiving something else.
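For context, that's just Pydantic enforcing the field's type annotation. Here's a minimal sketch of what's happening under the hood (assuming Pydantic v1, which matches the traceback; the Example model and PromptLike class below are hypothetical stand-ins, not LlamaIndex code):

Plain Text
from pydantic import BaseModel, ValidationError

class PromptLike:  # hypothetical stand-in for a Prompt object
    pass

class Example(BaseModel):  # hypothetical model with a str-typed field
    query_wrapper_prompt: str

try:
    Example(query_wrapper_prompt=PromptLike())  # not a string, can't be coerced
except ValidationError as e:
    print(e)  # 1 validation error ... str type expected (type=type_error.str)
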

In the extracts provided, the query_wrapper_prompt is usually an instance of a Prompt class, like SimpleInputPrompt. Here's an example from the extracts:

Plain Text
from llama_index.prompts.prompts import SimpleInputPrompt

query_wrapper_prompt = SimpleInputPrompt(
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{query_str}\n\n### Response:"
)


You should ensure that the query_wrapper_prompt argument is correctly defined before passing it to the HuggingFaceLLM class. If you're still facing issues, please refer to the LlamaIndex documentation for more detailed information.
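Putting that together with the code from the traceback, a corrected initialization might look like this (a sketch, assuming the legacy llama-index API shown in the traceback, where HuggingFaceLLM lives under llama_index.llms and query_wrapper_prompt takes a Prompt object; the <|USER|>/<|ASSISTANT|> markers follow the StableLM-Tuned chat format):

Plain Text
import torch
from llama_index.llms import HuggingFaceLLM
from llama_index.prompts.prompts import SimpleInputPrompt

# Define the wrapper as a Prompt object, not a bare string
query_wrapper_prompt = SimpleInputPrompt("<|USER|>{query_str}<|ASSISTANT|>")

llm = HuggingFaceLLM(
    context_window=1028,
    max_new_tokens=256,
    generate_kwargs={"temperature": 0.7, "do_sample": False},
    # system_prompt omitted here for brevity
    query_wrapper_prompt=query_wrapper_prompt,  # Prompt object, as this version requires
    tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
    model_name="StabilityAI/stablelm-tuned-alpha-3b",
    device_map="cuda",
    stopping_ids=[50278, 50279, 50277, 1, 0],
    tokenizer_kwargs={"max_length": 1028},
    model_kwargs={"torch_dtype": torch.float16},  # reduces memory use on CUDA
)
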
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
Let me check the knowledge base for you. One moment! :books:
----------------------
The query_wrapper_prompt argument in the HuggingFaceLLM class expects an instance of a Prompt class, not a string. The Prompt class is used to format the query string before it's passed to the language model.

In your case, you're passing a string directly to query_wrapper_prompt, which is causing the ValidationError.

Here's an example of how you can define a Prompt:

Plain Text
from llama_index.prompts.prompts import SimpleInputPrompt

query_wrapper_prompt = SimpleInputPrompt(
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{query_str}\n\n### Response:"
)


In this example, SimpleInputPrompt is a class that inherits from Prompt. It takes a string as an argument, which is a template for formatting the query string. The {query_str} part in the template is a placeholder that gets replaced with the actual query string.

After defining query_wrapper_prompt like this, you can pass it to HuggingFaceLLM:

Plain Text
llm = HuggingFaceLLM(
    # other arguments...
    query_wrapper_prompt=query_wrapper_prompt,
    # other arguments...
)


If you're still facing issues, please refer to the LlamaIndex documentation for more detailed information.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai
based on your previous answer, the query wrapper in my code contains only a string, but I'm still having this issue. What should I do in this case?
Let me check the knowledge base for you. One moment! :books:
Sorry @Ftm, I seem to be having a bit of trouble. Don't worry though, I've pinged my support and they will try to fix me up! :technologist:

Perhaps try again in 15mins?
----------------------
The query_wrapper_prompt argument in the HuggingFaceLLM class expects an instance of a Prompt class, not a string. The Prompt class is used to format the query string before it's passed to the language model.

If you're passing a string directly to query_wrapper_prompt, you'll get a ValidationError because it's expecting a Prompt object.

Here's an example of how you can create a Prompt object:

Plain Text
from llama_index.prompts.prompts import SimpleInputPrompt

query_wrapper_prompt = SimpleInputPrompt(
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{query_str}\n\n### Response:"
)


In this example, SimpleInputPrompt is a class that inherits from Prompt. The string passed to SimpleInputPrompt is a template that's used to format the query string. The {query_str} part of the template is where the query string will be inserted.

Once you've created the Prompt object, you can pass it to HuggingFaceLLM like this:

Plain Text
llm = HuggingFaceLLM(
    # other arguments...
    query_wrapper_prompt=query_wrapper_prompt,
    # other arguments...
)


If you're still facing issues, please refer to the LlamaIndex documentation for more detailed information.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai
why do I get this error only when using a conda environment?
Let me check the knowledge base for you. One moment! :books:
----------------------
Hmm, I don't know enough to give you a confident answer yet. The error you're experiencing could be due to a variety of reasons, such as package conflicts, incorrect environment settings, or other issues specific to your Conda environment. Without more specific information about the error and your environment, it's hard to provide a definitive solution.

I recommend checking the Conda documentation or the Python documentation for more information. If you're using LlamaIndex, you might also find the LlamaIndex documentation helpful.
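One concrete thing worth checking is whether the Conda environment has different llama-index or pydantic versions installed than the environment where the code works. A quick comparison sketch (this assumes llama_index exposes a __version__ attribute, which recent releases do):

Plain Text
# Run this in each environment and compare the output
import sys

import llama_index
import pydantic

print(sys.executable)           # which interpreter/environment is active
print(llama_index.__version__)  # llama-index version in this environment
print(pydantic.VERSION)         # pydantic version in this environment
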
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@FTn did you figure this out? I feel like I replied to this in another thread, but I can't remember if that was you

Latest versions of llama-index switched the query wrapper prompt type; it should just be a string now
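If so, the fix on a recent release would be to pass the template directly as a string (a sketch, assuming a llama-index version where query_wrapper_prompt accepts a plain format string, per the comment above; import paths vary across versions):

Plain Text
from llama_index.llms import HuggingFaceLLM  # import path varies by version

llm = HuggingFaceLLM(
    tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
    model_name="StabilityAI/stablelm-tuned-alpha-3b",
    device_map="cuda",
    # newer API: plain string; {query_str} is substituted at query time
    query_wrapper_prompt="<|USER|>{query_str}<|ASSISTANT|>",
)
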