
@Logan M Getting an error when updating Prompts:

Hello again, still trying to get RAG working with 100 percent German prompts, using an open-source LLM to keep patient data as private as possible.

After running

prompts_dict = query_engine.get_prompts()
print(list(prompts_dict))

I found that I was using two prompts: ['response_synthesizer:text_qa_template', 'response_synthesizer:refine_template']
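
For reference, you can also print which placeholder variables each of those templates expects before swapping anything in. A minimal sketch, reusing prompts_dict from above and assuming the pre-0.10 llama_index layout from the traceback below (and that template_vars is the attribute name on these prompt objects):

for name, prompt in prompts_dict.items():
    # template_vars lists the placeholders each template will be formatted with,
    # e.g. ['context_str', 'query_str'] for the QA template and
    # ['query_str', 'existing_answer', 'context_msg'] for the refine template
    print(name, prompt.template_vars)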

Changing the first prompt worked great and my answers became German more often. However, when I tried to change the refine_template prompt, I got an error:

File "/home/fabian/Desktop/RAG/.venv/lib/python3.10/site-packages/llama_index/llms/llm.py", line 165, in _get_messages
messages = prompt.format_messages(llm=self, prompt_args) File "/home/fabian/Desktop/RAG/.venv/lib/python3.10/site-packages/llama_index/prompts/base.py", line 185, in format_messages prompt = self.format(kwargs)
File "/home/fabian/Desktop/RAG/.venv/lib/python3.10/site-packages/llama_index/prompts/base.py", line 170, in format
prompt = self.template.format(**mapped_all_kwargs)
KeyError: 'existing_answer'
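
(For anyone hitting the same error later: str.format raises KeyError when the template contains a placeholder that was not supplied, so KeyError: 'existing_answer' usually means a template containing {existing_answer} ended up registered under the wrong key, or a custom refine template was formatted without that argument. Below is a rough sketch of a German refine template that keeps the three placeholders the default refine template uses; the wording and the import path are assumptions based on the pre-0.10 paths in the traceback.)

from llama_index.prompts import PromptTemplate  # assumed pre-0.10 import path

# Hypothetical German refine template; it must keep {query_str},
# {existing_answer} and {context_msg}, since those are the arguments
# the refine step passes in when formatting.
refine_prompt = PromptTemplate(
    "Die ursprüngliche Frage lautet: {query_str}\n"
    "Die bisherige Antwort lautet: {existing_answer}\n"
    "Zusätzlicher Kontext:\n{context_msg}\n"
    "Verfeinere die bisherige Antwort auf Deutsch, falls der neue Kontext "
    "hilfreich ist; andernfalls gib die bisherige Antwort unverändert zurück."
)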

My code currently looks like this:
Attachment: Screenshot_from_2024-01-22_17-45-14.png
I am beyond dumb. Leaving this here for anyone who runs into the same problem in the future. The line should have said:
query_engine.update_prompts(
    {
        "response_synthesizer:text_qa_template": qa_prompt_tmpl,
        "response_synthesizer:refine_template": refine_prompt,
    }
)
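
For completeness, a minimal sketch of what the full update can look like, assuming qa_prompt_tmpl is a German QA template with the usual {context_str} and {query_str} placeholders and refine_prompt keeps {existing_answer} as sketched above (the template wording here is just an example):

from llama_index.prompts import PromptTemplate  # assumed pre-0.10 import path

# Hypothetical German QA template; it needs {context_str} and {query_str}
qa_prompt_tmpl = PromptTemplate(
    "Kontextinformationen:\n{context_str}\n"
    "Beantworte die folgende Frage ausschließlich auf Deutsch und nur "
    "anhand des Kontexts: {query_str}\n"
)

# Map each template to its matching key; assigning a template that contains
# {existing_answer} to the text_qa_template key is what triggers the
# KeyError above, because the QA step only supplies context_str and query_str.
query_engine.update_prompts(
    {
        "response_synthesizer:text_qa_template": qa_prompt_tmpl,
        "response_synthesizer:refine_template": refine_prompt,
    }
)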