Hi,

I'm currently using index.as_chat_engine() alongside Qdrant as a vector store.

I have a scenario where, if a vector lookup fails to return relevant search results, I need to send an error back to the client code interacting with this system.

From my understanding, as_query_engine() / as_chat_engine() will by default still invoke the LLM even without any retrieved context. How can I change this behaviour so that an error is returned when no matches are found in the vector store?
4 comments
I think if you don't get relevant results in the form of nodes, you still get a response object, but response.response is None.

There are no LLM calls in that case.
You can check for this and then throw the error.
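The check suggested above could be sketched like this. This is a minimal, hedged example: `NoMatchError` and `answer_or_raise` are illustrative names I've made up, not LlamaIndex APIs; the only assumption taken from the thread is that the response object exposes a `.response` attribute that is None when retrieval came back empty.

```python
class NoMatchError(Exception):
    """Raised when the vector store returned nothing relevant (illustrative name)."""


def answer_or_raise(response):
    """Return the answer text, or raise if retrieval found nothing.

    Assumes a LlamaIndex-style response object whose `.response`
    attribute is None when no relevant nodes were retrieved.
    """
    if response is None or getattr(response, "response", None) is None:
        # Surface the empty result to the client instead of
        # returning an LLM-less / empty answer.
        raise NoMatchError("no relevant matches in the vector store")
    return response.response
```

The client code calling the query engine can then catch `NoMatchError` and translate it into whatever error format the API returns.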
Hi @WhiteFang_Jr Thank you for your response. However, I have a question. When there's no relevant match in the vector DB, the final response from either the query engine or the chat engine always seems to be something like "It appears that there is no specific information available," which sounds like it was written by an LLM. So even when there's no match in the DB, it still goes through the LLM, right?

Additionally, I logged the response object for a query where I knew there would be no relevant matches in the vector DB. It logged an answer similar to the one mentioned above, not None.

cc @Logan M Tagging you in case you have some insight into this!
I figured this out!

I just needed a retriever:

index.as_retriever()

And then if the retriever returns zero nodes (from retriever.retrieve(prompt)), I can skip the query entirely.
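The retriever-first pattern described above could look something like this. It is a sketch under assumptions: `query_with_guard` is an illustrative name, `index` is assumed to be an existing LlamaIndex vector store index, and only `as_retriever()`, `retrieve()`, `as_query_engine()`, and `query()` are taken from the thread itself.

```python
def query_with_guard(index, prompt, top_k=3):
    """Retrieve first; only call the LLM if the vector store had matches.

    `index` is assumed to expose the LlamaIndex-style methods
    as_retriever()/as_query_engine(); everything else here is illustrative.
    """
    retriever = index.as_retriever(similarity_top_k=top_k)
    nodes = retriever.retrieve(prompt)
    if not nodes:
        # Zero nodes: report the miss to the client and never invoke the LLM.
        raise ValueError("no relevant matches in the vector store")
    # Matches exist, so a full query (retrieval + LLM synthesis) is worthwhile.
    query_engine = index.as_query_engine(similarity_top_k=top_k)
    return query_engine.query(prompt)
```

This does run retrieval twice (once in the guard, once inside the query engine); if that matters, the retrieved nodes can instead be passed to a response synthesizer directly, at the cost of a little more wiring.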