I am using a QueryFusionRetriever with a CondensePlusContextChatEngine, where I have two retrievers (a BM25Retriever and one built from VectorStoreIndex.from_vector_store), and Langfuse for traces. When I use the condense-plus-context chat engine, the traces are not well separated into the multiple retrievers, the multiple generated queries, and the fused nodes, the way they are cleanly separated with index.as_chat_engine.
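For reference, Langfuse is hooked into LlamaIndex through the global callback manager; this part is not in the snippet below, but it is roughly the following sketch (assuming the standard langfuse.llama_index callback handler, with credentials taken from the LANGFUSE_* environment variables):

```python
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from langfuse.llama_index import LlamaIndexCallbackHandler

# Register the Langfuse callback handler globally so every LlamaIndex
# component (retrievers, query generation, LLM calls) reports spans to it.
langfuse_handler = LlamaIndexCallbackHandler()  # reads keys/host from env vars
Settings.callback_manager = CallbackManager([langfuse_handler])
```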
```python
from llama_index.core import Settings, VectorStoreIndex
from llama_index.core.chat_engine import CondensePlusContextChatEngine
from llama_index.core.retrievers import QueryFusionRetriever
from llama_index.llms.openai import OpenAI
from llama_index.retrievers.bm25 import BM25Retriever

# vector_store, nodes, BOT_QUERY_GEN_PROMPT and SUPPORT_BOT_SYSTEM_PROMPT
# are defined elsewhere in my application.


def get_chat_engine() -> "CondensePlusContextChatEngine":
    Settings.llm = OpenAI(model="gpt-4o", temperature=0.1)
    index = VectorStoreIndex.from_vector_store(vector_store=vector_store)

    # Fuse a dense (vector) retriever and a sparse (BM25) retriever,
    # generating extra queries and merging results with reciprocal rank fusion.
    retriever = QueryFusionRetriever(
        [
            index.as_retriever(similarity_top_k=3),
            BM25Retriever.from_defaults(nodes=nodes, similarity_top_k=2, verbose=True),
        ],
        similarity_top_k=2,
        num_queries=2,
        mode="reciprocal_rerank",
        use_async=False,
        verbose=True,
        query_gen_prompt=BOT_QUERY_GEN_PROMPT,
    )

    chat_engine = CondensePlusContextChatEngine.from_defaults(
        retriever=retriever,
        system_prompt=SUPPORT_BOT_SYSTEM_PROMPT,
        streaming=True,
    )
    return chat_engine
```
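The engine is then consumed with streaming, along these lines (a rough sketch; the question text is just an example):

```python
chat_engine = get_chat_engine()

# Each call should produce one Langfuse trace; tokens are streamed back to the caller.
response = chat_engine.stream_chat("How do I reset my password?")
for token in response.response_gen:
    print(token, end="", flush=True)
```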