
Updated 2 months ago

Can we create an index from a retriever? I want to do this because I need to pass the index created from a custom retriever to a Chat Engine in Condense Question mode.

Please help.
8 comments
You can create a query engine from your retriever, then pass that query engine to the chat engine:

Plain Text
from llama_index.query_engine import RetrieverQueryEngine
from llama_index.chat_engine import CondenseQuestionChatEngine

# Wrap your custom retriever in a query engine
query_engine = RetrieverQueryEngine.from_args(retriever)

# Pass the query engine to the condense-question chat engine
chat_engine = CondenseQuestionChatEngine.from_defaults(query_engine)
I have another follow-up query: is there a way to chain three different retrievers, i.e. a vector retriever, a keyword retriever, and a BM25 retriever?
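One common approach is a custom hybrid retriever that runs all of them and fuses their results. Below is a minimal, library-free sketch of just the fusion step (deduplicate by node id, keep the highest score, sort descending); in LlamaIndex you would put this logic inside the `_retrieve` method of a `BaseRetriever` subclass and work with `NodeWithScore` objects instead of tuples. All names and scores here are illustrative.

```python
# Minimal sketch of result fusion for a hybrid retriever.
# Each retriever is assumed to return a list of (node_id, score) pairs.

def fuse_results(*result_lists):
    """Merge ranked lists: dedupe by node id, keep the best score per node."""
    best = {}
    for results in result_lists:
        for node_id, score in results:
            if node_id not in best or score > best[node_id]:
                best[node_id] = score
    # Highest-scoring nodes first
    return sorted(best.items(), key=lambda item: item[1], reverse=True)

# Illustrative outputs from three retrievers
vector_hits = [("n1", 0.9), ("n2", 0.7)]
keyword_hits = [("n2", 0.8), ("n3", 0.5)]
bm25_hits = [("n3", 0.6), ("n4", 0.4)]

fused = fuse_results(vector_hits, keyword_hits, bm25_hits)
# fused keeps n2 at 0.8 (its best score) and n3 at 0.6
```

The same idea generalizes to reciprocal-rank fusion if you prefer rank-based rather than score-based merging, since raw scores from different retrievers are not always comparable.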
Plain Text
# Requires: from llama_index.memory import ChatMemoryBuffer
def condense_context_question_chatbot_engine(index):
    memory = ChatMemoryBuffer.from_defaults(token_limit=3900)
    chat_engine = index.as_chat_engine(
        chat_mode="condense_plus_context",
        memory=memory,
        system_prompt=(
            "You are a helpful and friendly chatbot who addresses <your requirement here>"
            "Here are the relevant documents for the context:\n"
            "{context_str}"
            "\nInstruction: Use the previous chat history, or the context above, to interact and help the user."
        ),
        verbose=True,
    )
    return chat_engine
This is my function.
I want to include a memory buffer along with a hybrid retriever; please let me know how to do that.
I want to use the Condense Plus Context chat engine.
I see this in the documentation, but I am not sure about the syntax:

Plain Text
classmethod from_defaults(
    retriever: BaseRetriever,
    service_context: Optional[ServiceContext] = None,
    chat_history: Optional[List[ChatMessage]] = None,
    memory: Optional[BaseMemory] = None,
    system_prompt: Optional[str] = None,
    context_prompt: Optional[str] = None,
    condense_prompt: Optional[str] = None,
    skip_condense: bool = False,
    node_postprocessors: Optional[List[BaseNodePostprocessor]] = None,
    verbose: bool = False,
    **kwargs: Any,
) -> CondensePlusContextChatEngine
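Based on that signature, the wiring would look roughly like the sketch below. This assumes a legacy (pre-0.10) llama_index install and that `hybrid_retriever` is your own combined retriever instance (a hypothetical name here); it also needs a configured LLM to actually run, so treat it as a configuration fragment rather than a complete script.

```python
from llama_index.chat_engine import CondensePlusContextChatEngine
from llama_index.memory import ChatMemoryBuffer

# hybrid_retriever is assumed to be your custom retriever instance
memory = ChatMemoryBuffer.from_defaults(token_limit=3900)

chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever=hybrid_retriever,
    memory=memory,
    system_prompt="You are a helpful and friendly chatbot.",
    verbose=True,
)

response = chat_engine.chat("Hello!")
```

The key point is that `from_defaults` takes the retriever directly (no index or query engine needed), and `memory` is passed as a keyword argument alongside it.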