Chat engine

At a glance

The community members are discussing the purpose of using the chat_mode="openai" option when creating a chat engine. The main points are:

1. The openai mode allows the chat engine to use tools and query them to provide appropriate answers, but it's unclear how to create these tools.

2. One community member suggests that the openai mode is more reliable than the react mode, as it uses a function calling API instead of requiring complex instructions and parsing.

3. Another community member notes that the openai mode provides chat history, which is not available when directly querying the index.

However, there is no explicitly marked answer to the original question about the purpose of using the openai mode in a chat engine.

What is the point of using chat_mode="openai" when creating a chat engine? Is the whole point of this chat_mode to be able to use tools, query them, and then provide an appropriate answer? How do I create tools when creating a chat engine this way?
I'm referring to this article - https://gpt-index.readthedocs.io/en/latest/examples/chat_engine/chat_engine_openai.html
5 comments
So, a chat engine is just a quick way to "chat" with an index. One of its mode options is using the OpenAI function calling API.

If you want to use more tools, I would set up one of our dedicated OpenAI agents.
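For reference, a minimal sketch of both routes. The imports follow the legacy llama_index layout used in the linked docs page and may differ in newer releases; the "data" folder and the "my_docs" tool name are placeholders.

```python
from llama_index import VectorStoreIndex, SimpleDirectoryReader
from llama_index.agent import OpenAIAgent
from llama_index.tools import QueryEngineTool, ToolMetadata

# Build an index over local documents ("data" is a placeholder folder)
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Route 1: chat engine over the index, driven by the OpenAI function calling API
chat_engine = index.as_chat_engine(chat_mode="openai", verbose=True)
print(chat_engine.chat("What do these documents cover?"))

# Route 2: a dedicated OpenAI agent, where the index is wrapped as one tool
# alongside any other tools you want to expose
query_tool = QueryEngineTool(
    query_engine=index.as_query_engine(),
    metadata=ToolMetadata(
        name="my_docs",
        description="Answers questions about the indexed documents.",
    ),
)
agent = OpenAIAgent.from_tools([query_tool], verbose=True)
print(agent.chat("Use the my_docs tool to summarize the documents."))
```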
What I'm actually asking is whether there's any point in providing this mode. Why would I use this route rather than directly querying my index?
I am guessing my chat engine itself becomes a tool in this case...
It provides chat history, compared to just querying your index, which is just a standalone search each time
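To illustrate the difference, a quick sketch (the follow-up question is hypothetical):

```python
# Chat engine: later turns can rely on conversation memory
chat_engine = index.as_chat_engine(chat_mode="openai")
chat_engine.chat("Who is the author of these documents?")
chat_engine.chat("What else have they written?")  # "they" resolves via chat history
chat_engine.reset()  # clear the conversation memory

# Query engine: every call is an independent, stateless search
query_engine = index.as_query_engine()
query_engine.query("What else have they written?")  # no idea who "they" refers to
```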
But that will happen even if I just write index.as_chat_engine() without specifying the "openai" mode, right? What is the edge added by this "openai" mode? Theoretically, it seems to be just unnecessary overhead. I mean, creating OpenAIAgents with multiple tools makes sense, but I'm not able to understand why I would use "openai" mode in a chat engine.
Yea, index.as_chat_engine() will use the openai mode automatically (at least in latest versions) if you are using a supported LLM. Otherwise it uses react mode.

OpenAI mode is much more reliable than react mode, since the function calling API does not require complex instructions and parsing like the react mode does.
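Concretely, the mode can also be pinned explicitly if you want to compare the two (a sketch; the exact default behaviour has varied across versions):

```python
# Let llama_index pick: function calling ("openai") if the LLM supports it,
# otherwise the ReAct-style prompt loop
auto_engine = index.as_chat_engine()

# Or force a specific mode
openai_engine = index.as_chat_engine(chat_mode="openai")  # function calling API
react_engine = index.as_chat_engine(chat_mode="react")    # prompt-based tool loop
```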