Hello! I found I often have the following exception:

Plain Text
  File "/var/app/venv/staging-LQM1lest/lib64/python3.8/site-packages/llama_index/core/llms/llm.py", line 252, in predict
    messages = self._get_messages(prompt, **prompt_args)
  File "/var/app/venv/staging-LQM1lest/lib64/python3.8/site-packages/llama_index/core/llms/llm.py", line 198, in _get_messages
    messages = prompt.format_messages(llm=self, **prompt_args)
  File "/var/app/venv/staging-LQM1lest/lib64/python3.8/site-packages/llama_index/core/prompts/base.py", line 211, in format_messages
    prompt = self.format(**kwargs)
  File "/var/app/venv/staging-LQM1lest/lib64/python3.8/site-packages/llama_index/core/prompts/base.py", line 196, in format
    prompt = self.template.format(**mapped_all_kwargs)
KeyError: "'name'"

It's thrown here:
Plain Text
response = llm.predict(Prompt(some_prompt))

I can't figure out why; the prompt is not empty. Thanks!
12 comments
Seems like you didn't supply a kwarg?

If I had to guess

llm.predict(Prompt(some_prompt), var1=val1, var2=val2, ...)
You need to provide the template args for your prompt.
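For illustration, a minimal sketch of what that looks like; PromptTemplate is the current name in llama_index.core for what Prompt used to be, MockLLM is just a stand-in LLM for the sketch, and the template text and variable name are made up:

Python
from llama_index.core import PromptTemplate
from llama_index.core.llms import MockLLM  # stand-in LLM, just for this sketch

llm = MockLLM()

# predict() forwards the kwargs to format_messages(), which fills {text}
prompt = PromptTemplate("Summarize the following text:\n{text}")
response = llm.predict(prompt, text="LlamaIndex is a data framework for LLMs.")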
What should I put there? In most cases it works without any additional parameters.
The prompt itself doesn't have any params inside.
Well, it depends on your prompt and whether it has variables. If the prompt contains JSON or curly braces, those probably need to be escaped.
Double braces like {{ text }} will escape them.
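Concretely, here's a minimal sketch of the failure mode using plain str.format(), which is what the template ends up calling; the template text is made up, but it reproduces the exact KeyError: "'name'" from the traceback:

Python
template = "Example output: {'name': 'Alice'}"

# str.format() parses {'name': 'Alice'} as a replacement field whose
# name is 'name' (quotes included) and fails to find it in the kwargs
try:
    template.format()
except KeyError as e:
    print("KeyError:", e)  # KeyError: "'name'"

# Doubling the braces escapes them, so format() emits literal { and }
escaped = "Example output: {{'name': 'Alice'}}"
print(escaped.format())  # Example output: {'name': 'Alice'}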
No, the prompt doesn't have any.
Can you share the prompt that causes this?
It'd be easier if I could reproduce this.
Sure, let me run the code and copy what's going on there
Ah wait, I think I know what's going on. In some cases the prompt variable may have a variable inside it.
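That would explain it: if text interpolated into some_prompt contains { or }, the template formatter treats it as a variable. One defensive fix is to escape braces in any dynamic content before building the prompt; a sketch, where escape_braces is a hypothetical helper and MockLLM again stands in for the real LLM:

Python
from llama_index.core import PromptTemplate
from llama_index.core.llms import MockLLM  # stand-in LLM, just for this sketch

llm = MockLLM()

def escape_braces(text: str) -> str:
    # Double every brace so str.format() treats them as literal characters
    return text.replace("{", "{{").replace("}", "}}")

user_content = "config = {'name': 'demo'}"  # dynamic text that contains braces
some_prompt = "Explain this snippet:\n" + escape_braces(user_content)
response = llm.predict(PromptTemplate(some_prompt))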