bavquant
I am trying to use LlamaIndex for a local RAG system with Ollama-served nomic embeddings, Ollama-served llama3 as the LLM, and a locally persistent vector store like Chroma. The problem appears to be that Chroma uses its own default embedding function, and I am not sure how to pass the Ollama-based nomic embeddings to the Chroma vector store. The result is that when I run my RAG queries, the similarity scores are way off; my hypothesis is that this is due to the mismatched embeddings. Any help, or advice on a different vector store that I can run locally and that also supports filtering by metadata, would be highly appreciated.
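A sketch of the usual wiring for this setup, assuming a local Ollama server with the `nomic-embed-text` and `llama3` models pulled, and assuming the `llama-index` Ollama/Chroma integration packages are installed (`llama-index-embeddings-ollama`, `llama-index-llms-ollama`, `llama-index-vector-stores-chroma`). The key point is that Chroma's own embedding function is never invoked: LlamaIndex computes the embeddings itself (using whatever `Settings.embed_model` is) and writes the resulting vectors into the collection, so the same model is used for indexing and querying. The paths and collection name here are arbitrary placeholders.

```python
# Sketch: using Ollama nomic embeddings end-to-end with a persistent Chroma store.
import chromadb
from llama_index.core import (
    Settings,
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
)
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama
from llama_index.vector_stores.chroma import ChromaVectorStore

# Make LlamaIndex use the Ollama models everywhere, for both
# indexing and query-time embedding, so the spaces match.
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")
Settings.llm = Ollama(model="llama3")

# PersistentClient keeps the collection on disk across runs.
client = chromadb.PersistentClient(path="./chroma_db")
collection = client.get_or_create_collection("docs")
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Embeddings are computed by Settings.embed_model, not by Chroma,
# and stored as precomputed vectors in the collection.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

query_engine = index.as_query_engine()
print(query_engine.query("What is this document about?"))
```

On the metadata question: Chroma does support filtering, and in LlamaIndex you can pass `filters=MetadataFilters(...)` (from `llama_index.core.vector_stores`) to `as_query_engine` / `as_retriever`, so switching stores should not be necessary just for that.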