Updated 9 months ago

I'm interested in using AutoMergingRetriever but I need it to function as a chat engine and to stream chat responses. My existing chat engine code is as follows:

Plain Text
chat_engine = index.as_chat_engine(
    similarity_top_k=similarity_top_k,
    node_postprocessors=node_postprocessors,
    vector_store_kwargs={"qdrant_filters": filters},
)


I'm unsure how to integrate AutoMergingRetriever with the chat functionality. The documentation (https://docs.llamaindex.ai/en/latest/examples/retrievers/auto_merging_retriever.html) suggests wrapping it in a RetrieverQueryEngine, but that only gives me a query engine. How can I get a chat engine?
4 comments
You can put the retriever into a RetrieverQueryEngine, and then give that as a tool to an agent
Thanks @Logan M, do you have any code examples showing how to do this?
Plain Text
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent

# Wrap the retriever in a query engine, expose it as a tool, and hand
# that tool to an agent (note: "QuerEngineTool" was a typo for QueryEngineTool)
tool = QueryEngineTool.from_defaults(
    RetrieverQueryEngine.from_args(retriever, ...),
    name="name",
    description="Useful for finding information about X.",
)

agent = OpenAIAgent.from_tools([tool], llm=llm)
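Since the original question also asked for streamed chat responses, here is a sketch of how the resulting agent can stream tokens. This is not from the thread itself; it assumes `retriever` (e.g. an AutoMergingRetriever), `llm`, and an OpenAI API key are already configured, and the tool name/description are placeholders:

```python
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent

# Assumed to exist already: `retriever` and `llm`
tool = QueryEngineTool.from_defaults(
    RetrieverQueryEngine.from_args(retriever),
    name="docs",  # placeholder name
    description="Useful for finding information about X.",
)
agent = OpenAIAgent.from_tools([tool], llm=llm)

# stream_chat returns a streaming chat response whose response_gen
# yields tokens as they arrive from the LLM
streaming_response = agent.stream_chat("Tell me about X")
for token in streaming_response.response_gen:
    print(token, end="", flush=True)
```

The agent behaves like a chat engine here: it keeps conversation history across `stream_chat` calls and decides when to invoke the retriever-backed tool.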