Find answers from the community

Updated 9 months ago

At a glance

The community member asks how to create a chat engine from a RetrieverQueryEngine. The first comment suggests using CondenseQuestionChatEngine from the llama_index library. The second comment notes that the hybrid setup hallucinates more when used with a chat engine. The third comment suggests customizing the prompts (e.g., how the question gets condensed), using a query engine tool with an agent and an appropriate system prompt, or using the retriever with a condense + context chat engine with appropriate prompts. There is no explicitly marked answer in the comments.

How can I create a chat engine from the Retriever query engine?
Plain Text
query_engine = RetrieverQueryEngine.from_args(
    retriever=hybrid_retriever,
    node_postprocessors=[cohere_rerank],
    llm=llm,
)
3 comments
Try with this:
Plain Text
from llama_index.core.chat_engine.condense_question import CondenseQuestionChatEngine

chat_engine = CondenseQuestionChatEngine.from_defaults(query_engine=query_engine, ...)
Yes, I found it on the document.
But the hybrid retriever with a chat engine is hallucinating more
You can try customizing some prompts (like how the question gets condensed), or try a different chat engine: use a query engine tool with an agent and an appropriate system prompt, or use the retriever with a condense + context chat engine with appropriate prompts