So, if I want to build a bot that combines semantic search with lexical search (using QueryFusionRetriever), there's no way to incorporate chat history?
Hmm, you can build a chatbot over anything

  • Throw the query fusion retriever into a RetrieverQueryEngine, and use it as a tool with an agent
  • Do the same as above, but with a condense-question chat engine
  • Put the retriever directly into a context or condense-plus-context chat engine
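For intuition on what the fusion step does: QueryFusionRetriever's default mode merges the ranked lists from the semantic and lexical retrievers with reciprocal rank fusion. Here's a minimal pure-Python sketch of that scoring (mock doc IDs, not the llama_index implementation):

```python
# Reciprocal rank fusion: merge ranked result lists from multiple
# retrievers into a single ranking. Each doc is scored by summing
# 1/(k + rank) over every list it appears in; k=60 is the usual default.

def reciprocal_rank_fusion(ranked_lists, k=60):
    scores = {}
    for ranked in ranked_lists:
        for rank, doc_id in enumerate(ranked, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)

# Mock rankings: semantic (vector) vs lexical (BM25) results
semantic = ["doc_a", "doc_b", "doc_c"]
lexical = ["doc_b", "doc_d", "doc_a"]
fused = reciprocal_rank_fusion([semantic, lexical])
print(fused)  # doc_b ranks first: it placed high in both lists
```

Docs that rank well in both lists bubble up, which is why fusing a vector retriever with BM25 tends to beat either alone.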
I'm looking at the docs, and I don't see that the RetrieverQueryEngine has a memory parameter.
right, the query engine is stateless
the memory is in the chat engine
(one of the options above)
Putting the retriever into a context chat engine is an interesting option. I'll look into it.
For example:

A react agent
Plain Text
from llama_index.core.tools import QueryEngineTool
from llama_index.core.agent import ReActAgent

# query_engine: any existing query engine, e.g. a RetrieverQueryEngine
# wrapping the fusion retriever
tool = QueryEngineTool.from_defaults(
    query_engine,
    name="<name>",
    description="<some useful description>",
)

agent = ReActAgent.from_tools([tool], llm=llm, verbose=True)


A context chat engine (both are constructed in much the same way)
Plain Text
from llama_index.core.chat_engine import ContextChatEngine, CondensePlusContextChatEngine

# retriever: e.g. the QueryFusionRetriever; chat history is managed
# by the chat engine itself
chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever,
    llm=llm,
)
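To see why the memory belongs in the chat engine rather than the stateless retriever/query engine, here's a hedged pure-Python sketch of the condense-plus-context flow. The class and the lambdas are hypothetical stand-ins for the real LLM and retriever, not llama_index APIs:

```python
# Sketch of the condense-plus-context pattern: the retriever and LLM
# are stateless functions; the chat history lives in the engine.

class CondenseChat:
    def __init__(self, retriever, llm):
        self.retriever = retriever  # stateless: query -> context docs
        self.llm = llm              # stateless: prompt -> text
        self.history = []           # the memory lives here

    def chat(self, message):
        if self.history:
            # Condense history + new message into a standalone query
            # (the real engine asks the LLM to do this rewrite).
            query = self.llm(f"Rewrite given history {self.history}: {message}")
        else:
            query = message
        context = self.retriever(query)
        answer = self.llm(f"Context: {context}\nQuestion: {message}")
        self.history.append((message, answer))
        return answer

# Mock components standing in for a real retriever and LLM
engine = CondenseChat(
    retriever=lambda q: f"[docs for: {q}]",
    llm=lambda prompt: f"answer({prompt[:20]}...)",
)
first = engine.chat("What is a chat engine?")
second = engine.chat("Does it keep memory?")
print(len(engine.history))  # -> 2
```

The second turn's query is rewritten using the stored history, which is exactly the part a bare RetrieverQueryEngine can't do on its own.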
I was looking at this page and didn't see the CondensePlusContextChatEngine option:

https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_condense_plus_context
This should be helpful! πŸ‘ πŸ˜ƒ
yea it uses index.as_chat_engine -- been meaning to add examples of the actual class constructors instead of shortcuts
Class constructors are a huge help in the docs. I bet there are so many features and options with these engines I wouldn't even be aware of unless I scoured through the source code.