
Updated 3 months ago

Retrieving relevant documents from llama index with chromadb vector store

Hi, I am working on RAG with LlamaIndex and ChromaDB as the vector store. While querying, I am trying to retrieve the documents used to answer the query. The issue with `response.source_nodes` is that even for a query like the one in the attached image, it still returns documents where the topic is not mentioned at all, and I am not sure how to fix this. Is there an alternative option that would work for my requirement?
Attachment: image.png
7 comments
Does your document contain info about vector embeddings?
Nope, it does not. So I am wondering: if there is no info about vector embeddings, why is it showing that document in the source nodes?
It may have caught something similar in the given node, since your query is matched against the docs using cosine similarity.
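For intuition, cosine similarity measures the angle between the query embedding and each document embedding, so a document can score moderately high even when it never mentions the query topic. A plain-Python sketch (real vector stores use optimized implementations):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Identical direction scores 1.0; orthogonal vectors score 0.0.
print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # -> 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # -> 0.0
```

Because the retriever always returns the top-k closest vectors, some nodes come back even when nothing in the corpus is actually relevant.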
You can use node_postprocessors like https://docs.llamaindex.ai/en/stable/module_guides/querying/node_postprocessors/node_postprocessors/#similaritypostprocessor

This eliminates nodes below the set threshold value, so if the retriever picks up anything scoring below the threshold, it will be removed.
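The filtering idea above can be sketched in plain Python. This is an illustrative stand-in for what `SimilarityPostprocessor` does, not the class itself; the function name, tuple shape, and example scores here are made up for the demo:

```python
def filter_by_similarity(scored_nodes, similarity_cutoff=0.7):
    """Drop retrieved nodes whose similarity score falls below the cutoff,
    mirroring the behavior of LlamaIndex's SimilarityPostprocessor.
    scored_nodes: list of (node_text, score) tuples."""
    return [(text, score) for text, score in scored_nodes
            if score >= similarity_cutoff]

# Hypothetical retrieval results: only the high-scoring node survives.
retrieved = [("doc about embeddings", 0.82), ("unrelated doc", 0.31)]
print(filter_by_similarity(retrieved))  # -> [("doc about embeddings", 0.82)]
```

With LlamaIndex itself, this is typically wired in via the query engine, e.g. `index.as_query_engine(node_postprocessors=[SimilarityPostprocessor(similarity_cutoff=0.7)])`, per the docs linked above.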
Ahh okay. The same documents would come back even if my query were just "hi", so I think using these node_postprocessors should help.
Yeah, this should help.
I don't think this is helpful, to be honest, because many of my actual questions that are covered in the doc have a similarity score below 0.70, so I am not really sure this will help.
Do you have any other way?