
Updated 2 months ago


Okay, so I have my custom retriever to query my indexes and return Nodes. I want to combine it with a chat memory and use it as a chat engine. How would I go about doing that? I don't want to use a response synthesizer, since I want to keep my LLM calls to a minimum. Ideas?
2 comments
You can hook any retriever up to a CondensePlusContextChatEngine or ContextChatEngine:

Plain Text
from llama_index.chat_engine import CondensePlusContextChatEngine
from llama_index.memory import ChatMemoryBuffer

# Optional: pass an explicit chat memory (legacy v0.9-style API shown,
# matching the service_context usage below)
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever,
    memory=memory,
    service_context=service_context,
    ...
)
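To see why this keeps LLM calls to a minimum, here is a minimal sketch of the pattern a context chat engine follows: retrieved nodes are stuffed directly into the prompt of a single LLM call per turn, with a memory buffer carrying prior messages, and no separate response-synthesizer step. All classes below are hypothetical stand-ins for illustration, not llama_index APIs.

```python
class SimpleMemory:
    """Keeps an ordered list of (role, content) chat messages."""
    def __init__(self):
        self.messages = []

    def put(self, role, content):
        self.messages.append((role, content))

    def get(self):
        return list(self.messages)


class ContextChatEngineSketch:
    """One retrieval + one LLM completion per turn; no synthesizer."""
    def __init__(self, retriever, llm, memory):
        self.retriever = retriever  # anything with .retrieve(query) -> list[str]
        self.llm = llm              # anything with .complete(prompt) -> str
        self.memory = memory

    def chat(self, user_msg):
        # Retrieve context once and inline it into the prompt.
        context = "\n".join(self.retriever.retrieve(user_msg))
        history = "\n".join(f"{role}: {text}" for role, text in self.memory.get())
        prompt = (
            f"Context:\n{context}\n\n"
            f"History:\n{history}\n"
            f"user: {user_msg}"
        )
        # Single LLM call for the whole turn.
        answer = self.llm.complete(prompt)
        self.memory.put("user", user_msg)
        self.memory.put("assistant", answer)
        return answer
```

CondensePlusContextChatEngine works similarly but adds one extra LLM call per turn to condense the chat history into a standalone question, so ContextChatEngine is the cheaper of the two if minimizing calls is the priority.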