
Hi all @kapa.ai#2237 @Logan M

I have an index for querying, and after a query is answered I want to retrieve a few suggested queries related to the previous one. How can I do that?

I am using RetrieverQueryEngine.
Thanks @WhiteFang_Jr. I have another question: I use CondenseQuestionChatEngine to add chat history on top of a RetrieverQueryEngine. But when I ask many questions, the answers are not as good as when I call query_engine.query directly. Here is the code:

from llama_index.query_engine import RetrieverQueryEngine
from llama_index.chat_engine import CondenseQuestionChatEngine

# Streaming query engine over my retriever with a custom QA prompt
query_engine = RetrieverQueryEngine.from_args(retriever, streaming=True, text_qa_template=prompt_tmpl, service_context=service_context)

# Chat engine that condenses the follow-up question using the chat history
chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=query_engine,
    verbose=False,
    chat_history=custom_chat_history,
)
response = chat_engine.stream_chat(input_text)
CondenseQuestionChatEngine transforms your query based on the chat history that you provide and uses the condensed query for querying.
I think that could be the reason you are not getting better results.
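For reference, a minimal sketch of how to inspect that behaviour, assuming the legacy llama_index 0.9-style API used above (condense_question_prompt and PromptTemplate are part of that API, but the prompt text here is just an illustration): pass a custom condense prompt and set verbose=True so you can see the standalone question the engine actually sends to the query engine.

from llama_index.prompts import PromptTemplate

# Custom condense prompt; {chat_history} and {question} are the variables
# the chat engine fills in before rewriting the follow-up message.
custom_condense_prompt = PromptTemplate(
    "Given the following conversation and a follow up message, rewrite the "
    "message as a standalone question that keeps all relevant context.\n\n"
    "<Chat History>\n{chat_history}\n\n"
    "<Follow Up Message>\n{question}\n\n"
    "<Standalone question>\n"
)

chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=query_engine,
    condense_question_prompt=custom_condense_prompt,
    chat_history=custom_chat_history,
    verbose=True,  # prints the condensed question before it hits the query engine
)

Comparing the printed condensed question with what you originally typed usually shows whether the rewrite is dropping details, which is the most common reason the chat engine's answers look worse than plain query_engine.query.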
In your opinion, when should I use it?
@WhiteFang_Jr @Logan M

How can I catch the event when the chatbot cannot respond?
You will get a response object in which you can check whether the LLM returned a response or not.
If there is no response, response.response will be equal to None, and you can handle that case.
If you do not want to condense your queries, you can pass condense = False while creating the condense chat engine.
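A rough sketch of the response.response check described above (assuming the non-streaming chat() path, where the result exposes a .response attribute; with stream_chat you would do the same check only after the stream has been consumed):

response = chat_engine.chat(input_text)

# Handle the case where the LLM produced no usable answer
if response.response is None or str(response.response).strip() == "":
    print("Sorry, I could not find an answer in the indexed documents.")
else:
    print(response.response)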
I see it answers with something like: "Sorry. The content is not in the context provided".
Yeah that means that it is not able to find anything relevant to your query in the context.
How to catch that event?
Maybe modify the prompt so the LLM includes a specific phrase in its response whenever it cannot generate an answer from the context, and then catch that phrase (see the sketch below).
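As a sketch of that idea (NO_ANSWER_FOUND is just a hypothetical sentinel string, and this uses a non-streaming query for simplicity; with streaming you would check the text after consuming the generator):

from llama_index.prompts import PromptTemplate

NO_ANSWER = "NO_ANSWER_FOUND"  # any string the LLM would never produce naturally

# QA prompt that tells the LLM to emit the sentinel when the context has no answer
qa_tmpl_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the query using only the context above.\n"
    "If the context does not contain the answer, reply with exactly "
    + NO_ANSWER + ".\n"
    "Query: {query_str}\n"
    "Answer: "
)

query_engine = RetrieverQueryEngine.from_args(
    retriever,
    text_qa_template=PromptTemplate(qa_tmpl_str),
    service_context=service_context,
)

response = query_engine.query(input_text)
if NO_ANSWER in str(response):
    # This is the "chatbot cannot respond" event you can catch and handle
    print("The bot could not answer from the provided context.")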