
Updated 3 months ago

Chat engine

What is the point of using chat_mode="openai" when creating a chat engine? Is the whole point of this chat_mode to be able to use tools, query them, and then provide an appropriate answer? How do I create tools when creating a chat engine this way?
I'm referring to this article - https://gpt-index.readthedocs.io/en/latest/examples/chat_engine/chat_engine_openai.html
So, a chat engine is just a quick way to "chat" with an index. One of those chat modes uses the OpenAI function calling API.

If you want to use more tools, I would set up one of our dedicated OpenAI agents.
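Roughly, the "dedicated agent" route means wrapping a query engine as a tool and handing it to an OpenAIAgent. A minimal sketch, assuming the legacy (pre-0.10) llama_index package layout that the linked gpt-index docs use; the tool name "docs", its description, and the "./data" directory are illustrative placeholders:

```python
import os

# Legacy llama_index (0.8/0.9-era) imports; guarded so the sketch degrades
# gracefully if the library isn't installed.
try:
    from llama_index import VectorStoreIndex, SimpleDirectoryReader
    from llama_index.agent import OpenAIAgent
    from llama_index.tools import QueryEngineTool, ToolMetadata
    HAVE_LLAMA_INDEX = True
except ImportError:
    HAVE_LLAMA_INDEX = False


def build_agent(docs_dir="./data"):
    """Wrap an index's query engine as a tool and give it to an OpenAIAgent."""
    docs = SimpleDirectoryReader(docs_dir).load_data()
    index = VectorStoreIndex.from_documents(docs)

    # The query engine becomes one tool among potentially many.
    docs_tool = QueryEngineTool(
        query_engine=index.as_query_engine(),
        metadata=ToolMetadata(
            name="docs",  # hypothetical tool name
            description="Answers questions about the indexed documents.",
        ),
    )
    return OpenAIAgent.from_tools([docs_tool], verbose=True)


if __name__ == "__main__" and HAVE_LLAMA_INDEX and os.environ.get("OPENAI_API_KEY") and os.path.isdir("./data"):
    agent = build_agent()
    print(agent.chat("What do the docs say about chat engines?"))
```

This needs an OpenAI API key and some documents on disk; the point is just that with an agent you control the tool list explicitly, whereas the chat engine wires up a single index for you.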
What I'm actually asking is whether there's any point in providing this mode. Why would I use this route rather than directly querying my index?
I am guessing my chat engine itself becomes a tool in this case...
It provides chat history, compared to just querying your index, which is just a standalone search each time
But that will happen even if I just write index.as_chat_engine() without specifying the "openai" mode, right? What is the edge added by this "openai" mode? Theoretically, it seems to be just unnecessary overhead. I mean, creating OpenAIAgents with multiple tools makes sense, but I'm not able to understand why I would use "openai" mode in chat_engine.
Yeah, index.as_chat_engine() will use the openai mode automatically (at least in the latest versions) if you are using a supported LLM. Otherwise it uses react mode.

OpenAI mode is much more reliable than react mode, since the function calling API does not require complex instructions and parsing like react mode does.
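The reliability difference comes down to how the tool call reaches the framework. A minimal sketch of the contrast (the tool name "search_docs" and the response shapes are illustrative, not the library's actual internals): with function calling the model returns a structured payload, while ReAct asks the model to emit a text protocol that must be parsed back out with patterns.

```python
import json
import re

# Function calling: the API hands back a structured tool call, so extracting
# the tool name and arguments is just dictionary access plus json.loads.
function_call = {
    "name": "search_docs",  # hypothetical tool name
    "arguments": json.dumps({"query": "chat engines"}),
}
fc_name = function_call["name"]
fc_args = json.loads(function_call["arguments"])

# ReAct: the model is instructed to write free text in a fixed shape, and the
# framework must recover the tool call by parsing that text (simplified here).
react_output = (
    "Thought: I need to look this up.\n"
    "Action: search_docs\n"
    "Action Input: chat engines"
)
match = re.search(r"Action: (\w+)\nAction Input: (.*)", react_output)
react_name, react_input = match.group(1), match.group(2)

# Both routes recover the same call, but the ReAct route breaks whenever the
# model deviates even slightly from the expected text format.
assert fc_name == react_name == "search_docs"
```

If the model forgets a newline, mislabels "Action", or adds extra chatter, the regex route fails, which is exactly the fragility the structured function calling API avoids.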