so a lot of this comes down to how chat works. There are only so many approaches to including chat history that make sense:
data agents (default) -- this chat engine uses indexes as "tools"; when a user sends a message, the LLM decides whether to call a tool (based on the tool names/descriptions) or just answer the user without one
context -- every user message is used to retrieve nodes from an index, and those nodes are inserted into the system prompt. Then the LLM reads that + the chat history to answer the query
condense question -- every user message is condensed into a standalone query based on the chat history, and then that query is sent to the query engine and an answer is returned (rough sketch of all three below)
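In code it looks roughly like this (a minimal sketch, not exact -- assuming the standard `as_chat_engine` entry point; the data folder is a placeholder and the chat_mode strings may differ by version):

```python
# Minimal sketch of the three chat modes, assuming the llama_index
# `as_chat_engine` API; "data" is a placeholder folder.
from llama_index import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# data agent (default) -- the LLM decides whether to call the index as a tool
agent_chat = index.as_chat_engine(chat_mode="best")

# context -- every message retrieves nodes and inserts them into the system prompt
context_chat = index.as_chat_engine(chat_mode="context")

# condense question -- history + message are condensed into a standalone query
condense_chat = index.as_chat_engine(chat_mode="condense_question")

print(context_chat.chat("What does the document say about X?"))
```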
For agents and context, setting a system prompt can help quite a bit -- something like system_prompt="Answer without any prior knowledge, use a tool/function when unsure how to answer a question".
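e.g. (sketch only, reusing `index` from the block above; extra kwargs like this should be forwarded to the underlying engine, but that may vary by version):

```python
# Sketch: passing a system prompt to a context chat engine.
chat_engine = index.as_chat_engine(
    chat_mode="context",
    system_prompt=(
        "Answer without any prior knowledge, "
        "use a tool/function when unsure how to answer a question."
    ),
)
```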
Also, setting good descriptions/tool names helps for agents
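Something like this (just a sketch -- the tool name/description are made up for illustration, and the `QueryEngineTool` / `OpenAIAgent` usage assumes the current llama_index classes, which may differ by version):

```python
# Sketch: giving an agent a tool with a descriptive name/description so the
# LLM knows when to call it. Name and description here are hypothetical.
from llama_index.agent import OpenAIAgent
from llama_index.tools import QueryEngineTool, ToolMetadata

sales_tool = QueryEngineTool(
    query_engine=index.as_query_engine(),  # reusing `index` from above
    metadata=ToolMetadata(
        name="sales_docs",
        description="Answers questions about the 2023 sales reports.",
    ),
)

agent = OpenAIAgent.from_tools(
    [sales_tool],
    system_prompt="Answer without any prior knowledge, use a tool when unsure.",
)
print(agent.chat("What were Q2 sales?"))
```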
@Logan M @jerryjliu0 , I can see a lot of info on chat engines over unstructured data (PDF), but I want to build a chat engine over structured data (SQL/pandas). I can use query engines on SQL and pandas, but a chat engine is needed for a conversation over tabular data. I can't find any example and got stuck. Can you guys assist here? TIA