What is the point of using chat_mode="openai" when creating a chat engine? Is the whole point of this chat_mode to be able to use tools, query them, and then provide an appropriate answer? How do I create tools when creating a chat engine this way? I'm referring to this article - https://gpt-index.readthedocs.io/en/latest/examples/chat_engine/chat_engine_openai.html
What I'm actually asking is whether there's any point in providing this mode. Why would I use this route rather than directly querying my index? I'm guessing my chat engine itself becomes a tool in this case...
But that will happen even if I just write index.as_chat_engine() without specifying the "openai" mode, right? What is the edge added by this "openai" mode? In theory it seems like unnecessary overhead. I mean, creating an OpenAIAgent with multiple tools makes sense, but I'm not able to understand why I would use "openai" mode in a chat engine.
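For reference, the pattern from the linked article is roughly this (a minimal sketch; the data directory and question are placeholders, and as I understand it the index's query engine is what gets wrapped as the agent's tool):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Build an index over your documents (placeholder path).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# "openai" mode wraps the index's query engine as a tool behind an
# OpenAI function-calling agent.
chat_engine = index.as_chat_engine(chat_mode="openai", verbose=True)

response = chat_engine.chat("What can you tell me about my data?")
print(response)
```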
Yeah, index.as_chat_engine() will use the openai mode automatically (at least in the latest versions) if you are using a supported LLM. Otherwise it uses react mode.
OpenAI mode is much more reliable than react mode, since the function calling API does not require the complex instructions and output parsing that react mode does.
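Rough sketch of the difference (assuming a recent LlamaIndex version; the data directory is a placeholder):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("./data").load_data()  # placeholder data dir
)

# No mode given: resolves to openai mode if the configured LLM supports
# function calling (e.g. gpt-3.5-turbo / gpt-4), otherwise falls back to react.
auto_engine = index.as_chat_engine()

# Pin the mode explicitly if you want to be sure which agent loop you get.
openai_engine = index.as_chat_engine(chat_mode="openai")
react_engine = index.as_chat_engine(chat_mode="react")

print(openai_engine.chat("Summarize the indexed documents."))
```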