Vectors

I'm using a chat engine to query a vector store. If no vectors exist in the vector store I don't get an error, which is a problem: it will simply send the query to the LLM with an empty context. How do I avoid this?
What chat engine are you using?

(Probably you can also check the vector store for content beforehand too)
@Logan M
Plain Text
index.as_chat_engine(
    chat_mode=ChatMode.CONTEXT,
    condense_question_prompt=PromptTemplate(CHAT_PROMPT_TEMPLATE),
    chat_history=chat_history,
    agent_chat_response_mode="StreamingAgentChatResponse",
    similarity_top_k=20,
)
I would have expected it to throw an error if there are no vectors at all
  1. Yeah, here it would retrieve 0 vectors, leaving the system prompt without any context, and the LLM would just respond anyway. Feel free to open a PR to change that behaviour
  2. Btw, setting similarity_top_k=20 will probably overflow the context window of most LLMs in this chat mode, unless your chunks are small or your LLM has a huge context window
  1. oh that's interesting, I don't think we've seen issues so far, oddly :/
  1. Do you have any ideas on how to work around that for now?
Are you using a vector db? Or just the default?
Pinecone (sadly... daylight robbery)
oof -- does Pinecone's API have a way to get a count of objects in an index? That's what I would suggest, if possible
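A minimal sketch of that Pinecone-side check, assuming the current pinecone Python client (the API key and index name below are placeholders): describe_index_stats() reports the total vector count, so you can bail out before ever building the chat engine.
Plain Text
from pinecone import Pinecone

# Placeholder credentials and index name -- substitute your own.
pc = Pinecone(api_key="YOUR_API_KEY")
pinecone_index = pc.Index("my-index")

# describe_index_stats() includes a total_vector_count field.
stats = pinecone_index.describe_index_stats()
if stats.total_vector_count == 0:
    # Nothing has been indexed yet -- avoid sending an empty-context query.
    raise ValueError("Vector store is empty")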
so there's no way to do something like index.vector_store.get_objects or something?
seems to be such a simple thing 😄 haha
you'd be surprised, but most vector db APIs don't offer a way to do this lol
one option is just doing is_empty = len(index.as_retriever(similarity_top_k=1).retrieve("test")) == 0
and "test" can be anything? ๐Ÿ˜ฎ
that simply goes to the vector db and fetches top_k=1? No LLM calls, right?
exactly, test can be anything
Ahhh nice, this prob does the job for me. Nice one, thx for your help @Logan M
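Putting the workaround together, a minimal sketch assuming a recent llama-index where ChatMode lives in llama_index.core.chat_engine.types and index is an existing VectorStoreIndex (the question string and fallback message are placeholders). The retrieve call embeds the throwaway query and hits the vector store, but makes no LLM call:
Plain Text
from llama_index.core.chat_engine.types import ChatMode

# Cheap emptiness check: try to retrieve a single node with a throwaway query.
is_empty = len(index.as_retriever(similarity_top_k=1).retrieve("test")) == 0

if is_empty:
    answer = "No documents have been indexed yet."  # placeholder fallback
else:
    chat_engine = index.as_chat_engine(
        chat_mode=ChatMode.CONTEXT,
        similarity_top_k=20,
    )
    answer = chat_engine.chat("your question here").response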