I'm currently using index.as_chat_engine() with Qdrant as the vector store.
I have a scenario where, if a vector lookup fails to return relevant search results, I need to send an error back to the client code interacting with this system.
By default, as I understand it, as_query_engine() / as_chat_engine() will still invoke the LLM even without any retrieved context. How can I change this behaviour so that an error is returned when no matches are found in the vector store?
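To make it concrete, here's roughly the guard I'm hoping to add: retrieve first, and only hand off to the chat engine when something relevant came back. This is just a sketch of the idea; the 0.75 threshold and the error type are placeholders I made up, not anything from llama_index.

```python
def has_relevant_match(scored_nodes, threshold=0.75):
    """Return True if any retrieved node's similarity score clears the threshold.

    `scored_nodes` is a list of (node, score) pairs; scores may be None
    depending on the retriever, so treat None as "no score" / not relevant.
    """
    return any(score is not None and score >= threshold
               for _node, score in scored_nodes)


# With llama_index it would look roughly like this (untested sketch;
# NoMatchError is a hypothetical exception I'd define myself):
#
#   retriever = index.as_retriever(similarity_top_k=3)
#   nodes = retriever.retrieve(user_message)
#   if not has_relevant_match([(n.node, n.score) for n in nodes]):
#       raise NoMatchError("No relevant context found in the vector store")
#   response = index.as_chat_engine().chat(user_message)
```

The point is to fail before the LLM call, rather than trying to detect an "I don't know" style answer after the fact.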
Hi @WhiteFang_Jr Thank you for your response. However, I have a question. When there's no relevant match in the vector DB, the final response from either the query engine or chat engine always seems to be something like "It appears that there is no specific information available," which reads like it was written by an LLM. So even when there's no match in the DB, the query still goes through the LLM, right?
Additionally, I logged the response object for a query where I knew there would be no relevant matches in the vector DB. It logged an answer similar to the one above, not None.
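For reference, this is roughly how I summarised the response object in my logs: the answer text plus the scores of whatever was retrieved (response.source_nodes in llama_index's Response object). The helper below is my own, not library API.

```python
def log_query_result(answer_text, source_scores):
    """Summarise a query result as a dict: answer text, number of retrieved
    source nodes, and their similarity scores (empty list = nothing retrieved)."""
    return {
        "answer": answer_text,
        "num_sources": len(source_scores),
        "scores": source_scores,
    }


# e.g. response = query_engine.query(question)
#      log_query_result(str(response), [n.score for n in response.source_nodes])
```

Even for the no-match query, the "answer" field held an LLM-generated sentence rather than being empty.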
cc @Logan M Tagging you in case you have some insight into this!