Updated 3 months ago

how can I choose the best embed model

How can I choose the "best" embed_model for my use case?
3 comments
They are all pretty similar tbh

But you can always take a look at the leaderboard
https://huggingface.co/spaces/mteb/leaderboard
It's necessary to choose one for Chroma, right? If I don't choose one, will the default be the one from OpenAI? I am using OpenAI..
LlamaIndex does not use the embedding model from Chroma; it will always use the one from the service context. By default that is OpenAI, yes, and it's not bad.

If you set embed_model="local" in the service context, it will use BAAI/bge-small-en running locally, which is also really good and fast in my experience (especially if you have CUDA installed)