When working with chromadb directly and loading the index from VectorStoreIndex.from_vector_store(), I get the following error when using chat_repl():
chromadb.errors.InvalidDimensionException: Embedding dimension 768 does not match collection dimensionality 384
I am using OpenAI as the LLM. I'm assuming this is because when I do chroma_collection.upsert() (via their API), it uses their default embedding model, which doesn't match the dimensions that the OpenAI embedding model produces?
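For anyone hitting this, here's a minimal sketch of the check chroma is doing under the hood (the real check lives inside chromadb; the class and function here are just illustrative). A collection created with chromadb's default embedding function (all-MiniLM-L6-v2, 384 dimensions) will reject vectors produced by any 768-dimensional model at upsert or query time:

```python
# Illustrative re-creation of chroma's dimensionality check.
# InvalidDimensionException and validate() are hypothetical stand-ins,
# not chromadb's actual internals.

class InvalidDimensionException(Exception):
    pass

def validate(collection_dim: int, embedding: list[float]) -> None:
    """Reject an embedding whose dimension doesn't match the collection's."""
    if len(embedding) != collection_dim:
        raise InvalidDimensionException(
            f"Embedding dimension {len(embedding)} does not match "
            f"collection dimensionality {collection_dim}"
        )

validate(384, [0.0] * 384)       # ok: same dimensionality as the collection
try:
    validate(384, [0.0] * 768)   # fails: query embedded with a 768-dim model
except InvalidDimensionException as e:
    print(e)
```

The collection's dimensionality is fixed by whichever model embedded the first documents, so every later write and query has to use a model with the same output size.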
Yeah, so there are two models in llama-index: the LLM and the embedding model.
It looks like the embedding model you are using in llama-index is not the same as the embedding model that created the index; these need to be the same.
Worked! If anyone else is using chromadb in a different pipeline, be sure to set HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
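For anyone wiring this up in llama-index itself, here's a configuration sketch (assuming a recent llama-index with the Settings API and its HuggingFaceEmbedding class; the path and collection name are placeholders). The point is just that the embed model you set must match the one that populated the collection:

```python
# Sketch: load an existing chroma collection with the same 384-dim
# embedding model that created it, while still using OpenAI as the LLM.
# "./chroma_db" and "my_collection" are placeholder names.
import chromadb
from llama_index.core import Settings, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.vector_stores.chroma import ChromaVectorStore

# Match chromadb's default embedding model (384 dimensions)
Settings.embed_model = HuggingFaceEmbedding(
    model_name="sentence-transformers/all-MiniLM-L6-v2"
)

client = chromadb.PersistentClient(path="./chroma_db")
collection = client.get_or_create_collection("my_collection")
vector_store = ChromaVectorStore(chroma_collection=collection)
index = VectorStoreIndex.from_vector_store(vector_store)
```

With the embed model pinned like this, queries are embedded at 384 dimensions and the InvalidDimensionException goes away; the LLM setting is independent and can stay on OpenAI.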