The query part of my code looks like

The query part of my code looks like this actually:

chat_engine = index_ready.as_chat_engine(
    verbose=True,
    system_prompt=system_prompt,
    condense_question_prompt=custom_prompt,
    chat_history=history,
    similarity_top_k=3,
)
response = chat_engine.chat(data["question"])

When I do a query, sometimes it prints this to the console:

=== Calling Function ===
Calling function: query_engine_tool with args: {
  "input": "my query text condensed"
}
Got output: Some text.
========================

and shows the correct answer, but sometimes it just doesn't print anything to the console and shows an unrelated answer that is not based on the documents I inserted. I'm going crazy with this; does anyone know what could be wrong?
2 comments

Logan M:
Since you aren't specifying a chat mode, it defaults to using an agent. The condense question prompt is not used in this case.
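
For reference: if you want the condense-question flow (so that condense_question_prompt is actually applied), you can request that mode explicitly. A minimal sketch reusing the names from the question above; note the condense-question engine may not accept system_prompt, so it is omitted here:

chat_engine = index_ready.as_chat_engine(
    chat_mode="condense_question",  # use the condense-question engine instead of an agent
    verbose=True,
    condense_question_prompt=custom_prompt,
    chat_history=history,
    similarity_top_k=3,
)
response = chat_engine.chat(data["question"])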

An agent decides whether or not to use the "query engine tool" based on the chat history + the description of the query engine

You can probably get better performance if you set up the agent and query engine tool yourself, with proper tool descriptions.
https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_with_query_engine.html
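
Roughly what that looks like, as a minimal sketch assuming the package layout from the linked example; the tool name "docs_query_tool" and its description are placeholders to adapt to your documents:

from llama_index.tools import QueryEngineTool, ToolMetadata
from llama_index.agent import OpenAIAgent

# A plain query engine over the existing index.
query_engine = index_ready.as_query_engine(similarity_top_k=3)

# Wrap it as a tool. The description is what the agent reads when
# deciding whether to call the tool, so make it specific.
query_tool = QueryEngineTool(
    query_engine=query_engine,
    metadata=ToolMetadata(
        name="docs_query_tool",  # placeholder name
        description="Answers questions about <your documents>.",  # placeholder description
    ),
)

agent = OpenAIAgent.from_tools(
    [query_tool],
    system_prompt=system_prompt,
    chat_history=history,
    verbose=True,
)
response = agent.chat(data["question"])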
@Logan M When doing a query, it outputs:

=== Calling Function ===
Calling function: query_engine_tool with args: {
  "input": "condensed question"
}
Got output: An answer for the condensed question.
========================

The condensed question is exactly the question I want, and it seems to be used in the query itself, since the bot answers my question correctly (the answer is not the value from the printed "Got output:"). But it seems to be using the "Got output:" value to pick the source nodes, so it is getting the wrong source nodes. Do you know what I'm doing wrong?
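
For what it's worth, on the agent path the retrieved nodes come back attached to the tool calls, so you can inspect what the agent actually used. A minimal sketch, assuming the agent chat response exposes the usual sources / source_nodes fields:

# Tool calls the agent made, with their outputs:
for tool_output in response.sources:
    print(tool_output.tool_name, tool_output.content)

# Source nodes aggregated from those tool outputs:
for source_node in response.source_nodes:
    print(source_node.score, source_node.node.get_text()[:200])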