
Updated 4 months ago

Logan M: Why don't we have the provision

At a glance

The community member is facing an issue with the response_synthesizer object and the chat_engines in their code. They are also interested in using a text_qa_template with the chat_engine object. Other community members provide some suggestions, such as using the OpenAIAgent class and the QueryEngineTool to create the chat engine, and initializing the response_synthesizer directly with the index.as_chat_engine() method. However, there is no explicitly marked answer to the original question.

Why don't we have the provision of providing a response_synthesizer object with chat_engines? It threw an error when I passed one. Say I want to specify a text_qa_template for the chat_engine object; how do I do that?
3 comments
Yea, **kwargs need better handling, it's on the todo list

What chat engine are you using? You can initialize it directly from the actual chat engine class. The kwargs you are passing will go into the query engine 👍

Python
from llama_index.agent import OpenAIAgent
from llama_index.tools.query_engine import QueryEngineTool

# convert query engine to tool
query_engine = index.as_query_engine(...)
query_engine_tool = QueryEngineTool.from_defaults(query_engine=query_engine)

# create agent/chat engine
agent = OpenAIAgent.from_tools(tools=[query_engine_tool], llm=llm)
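On the original text_qa_template part of the question: a minimal, self-contained sketch of the template contract. The template string must expose {context_str} and {query_str} slots, which the query engine fills with retrieved chunks and the user question. The commented llama_index lines are an assumption based on the legacy 0.x API shown in the thread, not verified here.

```python
# Sketch of the text_qa_template contract: a template with the two
# placeholders the query engine fills in at answer time.
QA_TEMPLATE = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the question using only the context above.\n"
    "Question: {query_str}\n"
)

# Hedged llama_index-style usage (legacy 0.x API, assumed, not verified):
# from llama_index.prompts import PromptTemplate
# qa_prompt = PromptTemplate(QA_TEMPLATE)
# query_engine = index.as_query_engine(text_qa_template=qa_prompt)
# ...then wrap query_engine in a QueryEngineTool as shown above.

# What the engine effectively does with the template at answer time:
prompt = QA_TEMPLATE.format(
    context_str="(retrieved chunks)",
    query_str="What is X?",
)
```

The key point from the comment above: the template goes to the query engine, and the chat engine/agent only sees the finished tool, so custom prompts travel with the query engine rather than with as_chat_engine().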
What I did was simple: index.as_chat_engine(response_synthesizer=response_synthesizer). Basically, the default chat mode.
Right, and the default chat mode is OpenAIAgent :Sadge: Hence the error, and why it's on the todo list lol
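A toy mock of why that call errors (not real llama_index code; every name here is invented for illustration): as_chat_engine forwards its **kwargs into the agent constructor, which only accepts its own parameters, so an unexpected keyword like response_synthesizer raises a TypeError.

```python
# Toy illustration of the kwargs-routing problem described above.
class MockOpenAIAgent:
    """Stand-in for the default chat mode's agent class (invented name)."""

    @classmethod
    def from_tools(cls, tools=None, llm=None):
        # Accepts only tools/llm; any other keyword raises TypeError.
        return cls()


def as_chat_engine(**kwargs):
    # Kwargs intended for the query engine end up here instead,
    # going straight into the agent constructor.
    return MockOpenAIAgent.from_tools(**kwargs)


try:
    as_chat_engine(response_synthesizer=object())
    error_message = None
except TypeError as exc:
    # e.g. "from_tools() got an unexpected keyword argument 'response_synthesizer'"
    error_message = str(exc)
```

This is the "**kwargs need better handling" issue from the first comment: the fix on the todo list would be to route query-engine kwargs to the query engine rather than the agent.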