Retriever

Hi everyone,

Is there a way to combine chat engine ("condense_plus_context" mode) functionality with auto-retrieval (OpenAIAssistantAgent/OpenAIAgent)? In the examples and tutorials it always seems to be one or the other. The closest thing seems to be a chat engine with chat_mode="openai", but I can't see a way to plug in my custom auto_retrieve_tool.
You can put a retriever into a condense-plus-context chat engine, or wrap it in a query engine and give it as a tool to an agent (a sketch of that second route follows the example below).

Just have to instantiate it directly, without the as_chat_engine shortcuts:

Python
from llama_index.core.chat_engine import CondensePlusContextChatEngine

# Build the chat engine straight from a retriever; pass any other kwargs you need
engine = CondensePlusContextChatEngine.from_defaults(retriever, llm=llm)
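For the second route, a minimal sketch, assuming a retriever and llm are already in scope; the tool name and description are placeholders:

Python
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent

# Wrap the retriever in a query engine so it can synthesize answers
query_engine = RetrieverQueryEngine.from_args(retriever, llm=llm)

# Expose the query engine as a tool the agent can call
tool = QueryEngineTool.from_defaults(
    query_engine,
    name="docs_search",
    description="Useful for answering questions over the indexed documents.",
)

agent = OpenAIAgent.from_tools([tool], llm=llm, verbose=True)
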
@Logan M Could you please give a quick example of how I should go about linking OpenAIAssistantAgent with CondensePlusContextChatEngine? It's not entirely clear to me...
Oh, with OpenAIAssistantAgent I'm not entirely sure. I think you'd want to wrap the chat engine as a tool for the assistant agent, something like the sketch below?

I thought you meant combining an auto-retriever + condense-plus-context chat engine lol
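A rough sketch of that wrapping, assuming engine is the CondensePlusContextChatEngine built above; the ask_docs wrapper, tool name, and instructions are all hypothetical:

Python
from llama_index.core.tools import FunctionTool
from llama_index.agent.openai import OpenAIAssistantAgent

# Hypothetical wrapper exposing the chat engine's chat() as a plain function
def ask_docs(query: str) -> str:
    """Answer a question about the indexed documents."""
    return str(engine.chat(query))

chat_tool = FunctionTool.from_defaults(fn=ask_docs)

assistant = OpenAIAssistantAgent.from_new(
    name="Docs Assistant",
    instructions="Use the ask_docs tool to answer questions about the documents.",
    tools=[chat_tool],
    verbose=True,
)

One design caveat: the chat engine keeps its own conversation state, and the assistant keeps a thread too, so nesting one inside the other means two layers of memory.
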
@Logan M Yeah, I was looking for a way to combine auto-retrieval (taking advantage of metadata filtering, in my case) as described here - https://docs.llamaindex.ai/en/latest/examples/agent/openai_assistant_query_cookbook/ - with a chat engine (to maintain conversational state).
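Since CondensePlusContextChatEngine.from_defaults accepts any retriever, one way to get both is to hand it a VectorIndexAutoRetriever directly. A minimal sketch, assuming an existing VectorStoreIndex named index and an llm; the metadata schema here is illustrative:

Python
from llama_index.core.chat_engine import CondensePlusContextChatEngine
from llama_index.core.retrievers import VectorIndexAutoRetriever
from llama_index.core.vector_stores import MetadataInfo, VectorStoreInfo

# Describe the metadata fields so the LLM can infer filters (illustrative schema)
vector_store_info = VectorStoreInfo(
    content_info="document chunks",
    metadata_info=[
        MetadataInfo(name="category", type="str", description="Category of the document"),
    ],
)

# The auto-retriever infers metadata filters plus a query string from each question
auto_retriever = VectorIndexAutoRetriever(index, vector_store_info=vector_store_info)

# The chat engine maintains conversational state on top of auto-retrieval
chat_engine = CondensePlusContextChatEngine.from_defaults(auto_retriever, llm=llm)
response = chat_engine.chat("What do the policy documents say about refunds?")

This pairs naturally: condense-plus-context first rewrites the conversation into a standalone question, and the auto-retriever then derives filters from that question.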