
Updated last month

Dear fellow LlamaIndexers, is it possible to get a chat_engine from a query_engine? I'm currently deriving a chat_engine from a vector index, but I would like to derive a chat engine from a router engine based on both a vector index and a summary index.
3 comments
you sure can, although only an agent or condense question chat engine are possible (context chat engines work with retrievers)
Plain Text
from llama_index.core.chat_engine import CondenseQuestionChatEngine

chat_engine = CondenseQuestionChatEngine.from_defaults(query_engine, ...)


from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent

tool = QueryEngineTool.from_defaults(query_engine, name="...", description="...")

agent = OpenAIAgent.from_tools([tool], ...)
Yes, I will go via RouterRetriever, which I found out about later!
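For anyone landing here later, a minimal sketch of that RouterRetriever approach, assuming `vector_index` and `summary_index` already exist and an LLM is configured; the tool descriptions here are placeholders:

```python
from llama_index.core.chat_engine import ContextChatEngine
from llama_index.core.retrievers import RouterRetriever
from llama_index.core.selectors import LLMSingleSelector
from llama_index.core.tools import RetrieverTool

# Wrap each index's retriever as a tool so the router can pick between them
vector_tool = RetrieverTool.from_defaults(
    retriever=vector_index.as_retriever(),
    description="Useful for retrieving specific facts from the documents",
)
summary_tool = RetrieverTool.from_defaults(
    retriever=summary_index.as_retriever(),
    description="Useful for high-level summaries of the documents",
)

# The selector uses the LLM to route each query to one retriever
retriever = RouterRetriever(
    selector=LLMSingleSelector.from_defaults(),
    retriever_tools=[vector_tool, summary_tool],
)

# A context chat engine works with any retriever, including the router
chat_engine = ContextChatEngine.from_defaults(retriever=retriever)
```

This keeps the routing at the retriever level, so the chat engine itself stays a plain context chat engine.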