Prompt

For the query engines in LlamaIndex, I know they take a qa_template / refine_template etc., but is that equivalent to the chat engine's system message?
How is the qa_template different from the chat_engine's system prompt?
The qa_template takes in the query_str, so the user's question is merged into the prompt.
But for the chat_engine it's two separate messages, a system message plus a user message? πŸ€”
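(For illustration, a minimal sketch of the two flows as I understand them; the template text below is abbreviated, not LlamaIndex's actual default:)

```python
from llama_index.core import PromptTemplate
from llama_index.core.llms import ChatMessage

# Query engine path: the user's question is substituted into a single
# prompt string together with the retrieved context.
qa_template = PromptTemplate(
    "Context information is below.\n"
    "{context_str}\n"
    "Given the context, answer the query.\n"
    "Query: {query_str}\n"
)
merged_prompt = qa_template.format(context_str="...", query_str="What is X?")

# Chat engine path: the system prompt and the user's turn stay separate.
messages = [
    ChatMessage(role="system", content="You are a helpful assistant."),
    ChatMessage(role="user", content="What is X?"),
]
```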
2 comments
You can check the prompts that are being used inside your chat engine by following this:

https://docs.llamaindex.ai/en/stable/module_guides/models/prompts/usage_pattern/#accessing-prompts
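A minimal sketch of that usage pattern, assuming the chat engine exposes `get_prompts()` the same way query engines do (the index/engine construction here is a placeholder):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# placeholder setup: any index / chat engine combination works the same way
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
chat_engine = index.as_chat_engine(chat_mode="condense_plus_context")

# get_prompts() returns a dict mapping prompt keys to their templates
for key, prompt in chat_engine.get_prompts().items():
    print(key)
    print(prompt.get_template())
```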


Also, to check what is being sent to the LLM, you can use observability: https://docs.llamaindex.ai/en/stable/module_guides/observability/#simple-llm-inputsoutputs
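From that observability page, the "simple" global handler prints every LLM input/output pair to stdout; a short sketch (the chat_engine here is assumed to already exist):

```python
from llama_index.core import set_global_handler

# the "simple" handler logs each prompt/response pair to stdout
set_global_handler("simple")

# any subsequent query_engine / chat_engine call will print what is
# actually sent to the LLM
response = chat_engine.chat("How is the qa_template used?")
```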
Just want to confirm: both the query engine and the chat engine use the Chat API, not the Completion API, for OpenAI, correct?
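One way to check rather than assume: LlamaIndex's LLM metadata reports whether a model is driven through the chat endpoint. A sketch assuming the `llama-index-llms-openai` integration:

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo")

# True means LlamaIndex routes calls through OpenAI's chat completions
# endpoint; completion-style calls are adapted onto it for chat models
print(llm.metadata.is_chat_model)
```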