Hello! I often run into the following exception:
  File "/var/app/venv/staging-LQM1lest/lib64/python3.8/site-packages/llama_index/core/llms/llm.py", line 252, in predict
    messages = self._get_messages(prompt, **prompt_args)
  File "/var/app/venv/staging-LQM1lest/lib64/python3.8/site-packages/llama_index/core/llms/llm.py", line 198, in _get_messages
    messages = prompt.format_messages(llm=self, **prompt_args)
  File "/var/app/venv/staging-LQM1lest/lib64/python3.8/site-packages/llama_index/core/prompts/base.py", line 211, in format_messages
    prompt = self.format(**kwargs)
  File "/var/app/venv/staging-LQM1lest/lib64/python3.8/site-packages/llama_index/core/prompts/base.py", line 196, in format
    prompt = self.template.format(**mapped_all_kwargs)
KeyError: "'name'"
It's thrown here:
response = llm.predict(Prompt(some_prompt))
I can't figure out why; the prompt is not empty. Thanks!
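For context, a minimal sketch of what I suspect is happening (my own guess, not a confirmed cause): the traceback ends in `self.template.format(...)`, i.e. plain `str.format`, and the key in `KeyError: "'name'"` includes single quotes. That pattern appears when the prompt text itself contains a literal brace-wrapped snippet like `{'name': ...}` (e.g. an embedded dict/JSON example), which `format` then treats as a template variable named `'name'`. The strings below are hypothetical, not from my actual prompt:

```python
# A prompt containing a literal dict/JSON example in curly braces.
prompt_text = "Return a dict like {'name': 'Alice'}"

try:
    # str.format parses {'name': 'Alice'} as a replacement field whose
    # field name is 'name' (quotes included), then fails the lookup.
    prompt_text.format()
except KeyError as exc:
    print(exc)  # "'name'" -- matches the exception in the traceback

# Doubling the braces escapes them, so format leaves them literal:
escaped = "Return a dict like {{'name': 'Alice'}}"
print(escaped.format())  # Return a dict like {'name': 'Alice'}
```

If this is the cause, escaping every literal `{` and `}` in `some_prompt` as `{{` and `}}` before wrapping it in `Prompt(...)` should make the error go away.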