
Updated 6 months ago

Chat engine

At a glance

The community members discuss an issue where the "chat" method of an OpenAIAgent fails when a CondensePlusContextChatEngine is wrapped in a QueryEngineTool. Despite the agent method being named "chat", the tool expects a query engine: a query engine and a chat engine are different objects with different interfaces, and QueryEngineTool calls .query(), which a chat engine does not implement.

They suggest using the "index.as_query_engine" method instead of "index.as_chat_engine", since a query engine is the appropriate object to wrap in a tool. The agent itself is already stateful: it decides whether to answer a query from the chat history alone or by calling a tool. The user can "force" the agent to use a specific tool, but this may lead to unwanted behavior.

They also note that the tool name and description may need tuning for the agent to work well with the data, and that any function can be wrapped in a tool for the agent to use.

Hello! When I try to use an agent with a QueryEngineTool to look for an answer in the context, I get an exception, and I see an inconsistency.
Although the agent's method is called "chat", it doesn't accept an engine created with the "as_chat_engine" method:

Plain Text
query_engine = index.as_chat_engine(
    chat_mode='condense_plus_context',
    similarity_top_k=similarity_top_k,
    llm=llm_engine,
    system_prompt=prepared_system_prompt,
)

query_tool = QueryEngineTool(
    query_engine=query_engine,
    metadata=ToolMetadata(
        name="query_tool",
        description=self.query_description,
    ),
)
tools.append(query_tool)

agent = OpenAIAgent.from_tools(
    tools,
    llm=llm_engine,
    verbose=True,
    system_prompt=self.system_prompt,
)
response = agent.chat(query_text, chat_history=chat_history)  # <====== Exception

The exception is "Got output: Error: 'CondensePlusContextChatEngine' object has no attribute 'query'"
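The error is a plain AttributeError surfaced through the tool: the tool calls .query() on whatever engine it wraps, and a chat engine exposes .chat() instead. A minimal self-contained sketch of the mismatch, using toy classes (hypothetical, not the real LlamaIndex implementation):

```python
class ToyChatEngine:
    """Stateful engine: exposes chat(), not query()."""
    def chat(self, message: str) -> str:
        return f"chat reply to: {message}"


class ToyQueryEngineTool:
    """Toy tool that assumes its engine has a .query() method."""
    def __init__(self, engine):
        self.engine = engine

    def __call__(self, question: str) -> str:
        try:
            return self.engine.query(question)
        except AttributeError as exc:
            # Mirrors: "Got output: Error: '...ChatEngine' object has no attribute 'query'"
            return f"Error: {exc}"


tool = ToyQueryEngineTool(ToyChatEngine())
print(tool("What is in my data?"))
```

Passing an object with the wrong interface only surfaces at call time, which is why the failure appears inside agent.chat rather than at construction.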
15 comments
That's right, a chat engine is not a query engine. They are two different objects with different interfaces
Yeah, but I'd suppose the agent's "chat" method should work with engines created as chat engines. If it doesn't work with chat engines, why is the method called "chat"? And how do I keep the conversation going and preserve the chat history without using a chat engine?
I'm not sure I follow.

A query engine is stateless. It has a .query() method

A chat engine is stateful, with chat history. It has a .chat() method
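The distinction can be sketched with two toy classes (hypothetical, not LlamaIndex APIs): a query engine answers each call independently, while a chat engine accumulates history across calls:

```python
class ToyQueryEngine:
    """Stateless: each query() call is independent of the others."""
    def query(self, question: str) -> str:
        return f"answer({question})"


class ToyStatefulChatEngine:
    """Stateful: chat() appends each message to an internal history."""
    def __init__(self):
        self.history: list[str] = []

    def chat(self, message: str) -> str:
        self.history.append(message)
        # A real chat engine would condition its answer on self.history.
        return f"reply({message}) after {len(self.history)} turn(s)"


qe = ToyQueryEngine()
ce = ToyStatefulChatEngine()
print(qe.query("q1"))        # no state carried between calls
print(ce.chat("hi"))
print(ce.chat("and again"))  # history now holds 2 turns
```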
Exactly. So agent.chat works only with a query_engine, not a chat engine, right? That seems pretty inconsistent
I still don't think I know what the issue or inconsistency is here 🫠
agent.chat doesn't work with a chat_engine, only a query_engine. How do I make agent.chat work with a stateful engine that supports chat history?
agent.chat is already stateful; the agent itself is a chat engine
But I want the question to be asked against my data, how can I do it? - with agent.
It's already being asked against your data.

An agent looks at the chat history and list of tools. And decides to respond using a tool (i.e. a query engine) or using the chat history alone. This allows it to respond to normal things like "Hi" and also leverage the tool for questions that involve your data

You can "force" it to use a specific tool at least once by passing in the tool name. But this might lead to some awkward/unwanted behavior

agent.chat("...", tool_choice="query_tool")
Thanks! Sorry but I don't follow. I don't want to force any tool. I want one of the tools to query my data. Should I just use the index.as_query_engine instead?
Maybe! It really depends on what you want.

An agent will have your index as one of the tools. But it chooses whether or not it needs to use it
Ok, I think I've got it, finally 😆 Is QueryEngineTool the only tool that can be used against my data?
You can also wrap any function in a tool 👀

Plain Text
from llama_index.core.tools import FunctionTool

def my_fn_name(input1: str) -> str:
    """This is used as the tool description."""
    return input1

tool = FunctionTool.from_defaults(fn=my_fn_name)
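The pattern above relies on the function's name and docstring doubling as the tool's name and description. A plain-Python sketch of that introspection convention, using only the standard library and a hypothetical function name (not the real FunctionTool internals):

```python
import inspect

def lookup_order_status(order_id: str) -> str:
    """Look up the status of an order by its id."""  # doubles as the tool description
    return f"status of {order_id}: shipped"

# FunctionTool.from_defaults-style derivation: the tool's name and
# description come from the wrapped function itself (sketch only).
tool_name = lookup_order_status.__name__
tool_description = inspect.getdoc(lookup_order_status)
print(tool_name, "->", tool_description)
```

Because the docstring becomes the description the agent reads when choosing tools, writing it as a clear statement of what the function does directly affects tool selection.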