Hi, I am trying to implement a chat engine. From what I understand, if I use index.as_chat_engine(), I can't use my custom retriever or query engine. Is that correct? Is there a way to implement a custom chat engine? Can I just use CondenseQuestionChatEngine myself?
Yeah, you can just instantiate any chat engine or agent directly:
from llama_index.core.chat_engine import CondensePlusContextChatEngine

# Plug in your own retriever directly
chat_engine = CondensePlusContextChatEngine.from_defaults(retriever, llm=llm, ...)
# CondenseQuestionChatEngine works the same way, wrapping a custom query engine