Embedding

I get the following answer with the previous code:

Plain Text
The author didn't mention what they did growing up. The context only talks about the author's experiences as an adult, such as painting, working on web apps, and starting companies. There is no information about their childhood or growing up years.


By the way, if I replace "sentence-transformers/all-MiniLM-L6-v2" with "BAAI/bge-base-en-v1", I get the expected answer:

Plain Text
The author wrote short stories and tried writing programs on the IBM 1401 computer in 9th grade.


I want to know whether sentence-transformers/all-MiniLM-L6-v2 is incompatible with llama-index, or whether other parameters need to be adjusted to make it work. Thank you in advance for your suggestions.
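For context, here is a minimal sketch of how the embedding model is typically swapped in llama-index. It assumes a recent llama-index version with the global `Settings` API and the `llama-index-embeddings-huggingface` integration package installed; the `"data"` directory is a hypothetical placeholder.

```python
# Sketch: swapping the local embedding model in llama-index.
# Assumes llama-index >= 0.10 (Settings API) and the
# llama-index-embeddings-huggingface package are installed.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Only model_name changes between the two runs described above.
Settings.embed_model = HuggingFaceEmbedding(
    model_name="sentence-transformers/all-MiniLM-L6-v2"  # or "BAAI/bge-base-en-v1"
)

documents = SimpleDirectoryReader("data").load_data()  # hypothetical data dir
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
```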
4 comments
That embedding model is not great 😅 You can try decreasing the chunk size while indexing (maybe 512?), or use another model. I usually use some BGE model.
Yeah, it seems like the only possible conclusion. I saw in the documentation that text longer than 256 words is truncated. I tried reducing the chunk size to 256, but it still doesn't work. Too bad, I wanted to use a lightweight multilingual model, but I'll settle for BAAI/bge-m3.
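For reference, reducing the chunk size as discussed above can be sketched like this (assuming llama-index >= 0.10; `SentenceSplitter` and the global `Settings` are the standard way to control chunking, and the chunk_overlap value here is just an illustrative choice):

```python
# Sketch: lowering the chunk size so chunks stay within the embedding
# model's input limit. Assumes llama-index >= 0.10 is installed.
from llama_index.core import Settings
from llama_index.core.node_parser import SentenceSplitter

# Simple global default, applied when building an index:
Settings.chunk_size = 256

# Equivalent explicit form via a node parser (overlap value illustrative):
Settings.node_parser = SentenceSplitter(chunk_size=256, chunk_overlap=20)
```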
Thanks for your time anyway, @Logan M.
bge-small-en-v1.5 is probably a good choice for a tiny model imo