
Is as_chat_engine ignoring both parameters?

At a glance

The community member is experiencing an issue with the as_chat_engine function, which appears to ignore both of the parameters they pass to it. They have also tried system_prompt, but the issue persists across different language models (OpenAI, Vertex, and LLaMA).

In the comments, another community member explains that the default chat engine is an agent, and that the qa/refine prompts belong to the query engine. The agent decides whether or not to use the query engine based on the chat history and the list of tools and their descriptions. The commenter recommends working at a lower level and skipping the as_chat_engine() function, as it hides too many details and parameters.

There is no explicitly marked answer in the provided information.
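
For context, here is a minimal sketch of the kind of call the question seems to describe, assuming the two parameters in question are the QA and refine prompt templates; the actual code, data path, and prompt text aren't shown in the thread and are placeholders here.

```python
from llama_index import VectorStoreIndex, SimpleDirectoryReader
from llama_index.prompts import PromptTemplate

# Placeholder data path and prompt text -- not from the thread.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

qa_prompt = PromptTemplate(
    "Context:\n{context_str}\nAnswer the query: {query_str}\nAnswer: "
)
refine_prompt = PromptTemplate(
    "Original answer: {existing_answer}\n"
    "New context:\n{context_msg}\n"
    "Refine the answer to the query: {query_str}\nRefined answer: "
)

# With the default chat mode, as_chat_engine() builds an agent on top of a
# query engine; custom prompts passed here can appear to have no effect on
# the chat responses, which is the behaviour reported in the thread.
chat_engine = index.as_chat_engine(
    text_qa_template=qa_prompt,
    refine_template=refine_prompt,
    system_prompt="You are a concise assistant.",
)
print(chat_engine.chat("What do the docs say about X?"))
```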

Useful resources
Is as_chat_engine ignoring both parameters?
We've tried with system_prompt too, with no success. It happens not only with OpenAI but also with Vertex and LLaMA; we're missing something.
1 comment
The default chat engine is an agent.

The qa/refine prompts are in the query engine.

The agent decides whether or not to use the query engine based on the chat history and list of tools + descriptions.

You might get better performance by tweaking things from a lower level and skipping as_chat_engine() (which hides too many details/parameters imo)

https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_with_query_engine.html
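
A hedged sketch of that lower-level approach, loosely following the linked notebook: set the QA prompt on the query engine (where it actually lives), wrap the query engine in a QueryEngineTool, and build an OpenAIAgent from it, passing system_prompt directly to the agent. The tool name, description, prompt text, data path, and model choice are placeholders, not details from the thread.

```python
from llama_index import VectorStoreIndex, SimpleDirectoryReader
from llama_index.agent import OpenAIAgent
from llama_index.llms import OpenAI
from llama_index.prompts import PromptTemplate
from llama_index.tools import QueryEngineTool, ToolMetadata

# Build the index as usual (placeholder data path).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# The QA/refine prompts belong to the query engine, so set them here.
qa_prompt = PromptTemplate(
    "Context information is below.\n{context_str}\n"
    "Answer the query using only this context.\nQuery: {query_str}\nAnswer: "
)
query_engine = index.as_query_engine(text_qa_template=qa_prompt)

# Expose the query engine to the agent as a tool; the description is what
# the agent uses to decide whether to call it.
tool = QueryEngineTool(
    query_engine=query_engine,
    metadata=ToolMetadata(
        name="docs_query",  # placeholder name
        description="Answers questions about the indexed documents.",
    ),
)

# The agent decides, per message, whether to call the tool based on the
# chat history and the tool description. system_prompt is applied here.
agent = OpenAIAgent.from_tools(
    [tool],
    llm=OpenAI(model="gpt-3.5-turbo"),
    system_prompt="You are a helpful assistant for our documentation.",
    verbose=True,
)

response = agent.chat("What do the docs say about X?")
print(response)
```

Because the agent, the tool, and the query engine are constructed explicitly, each prompt is set on the component that actually consumes it, rather than relying on as_chat_engine() to forward it.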