I have created embeddings using local

I have created embeddings using "local:BAAI/bge-small-en-v1.5" and stored them in Elasticsearch. I am using OpenAI for the query engine. However, I got this error:

```
BadRequestError: BadRequestError(400, 'search_phase_execution_exception', 'failed to create query: the query vector has a different dimension [1536] than the index vectors [384]')
```

My question is: should I look for a Hugging Face model with vector dimension 1536, or is there a way to query a VectorStore whose vector dimension differs from the dimension used by the query engine?
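The error itself is generic to vector search: a similarity score is only defined between vectors of the same length, so an index built from 384-dimensional bge-small vectors cannot be queried with 1536-dimensional OpenAI embeddings. A toy sketch of why (plain Python, not the Elasticsearch client; the error message is only mimicked for illustration):

```python
# Toy illustration: a vector index can only compare query vectors that have
# the same dimension as its stored vectors.
def cosine_similarity(a, b):
    if len(a) != len(b):
        # Elasticsearch surfaces the same kind of complaint as a 400
        # search_phase_execution_exception.
        raise ValueError(
            f"the query vector has a different dimension [{len(b)}] "
            f"than the index vectors [{len(a)}]"
        )
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

stored = [0.1] * 384   # bge-small-en-v1.5 output size
query = [0.1] * 1536   # OpenAI embedding output size
try:
    cosine_similarity(stored, query)
except ValueError as e:
    print(e)
```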
10 comments
It sounds like you aren't using bge for the queries πŸ‘€
yes, I want to use OpenAI.
where instead of using SimpleDirectoryReader, I read data from an Elasticsearch index
but the query is done using OpenAI, as far as I can understand
You need to use the same embeddings model for queries as you did to create the index

But remember, there are two models -- the LLM and the embedding model

The LLM can change at any time. But the embedding model has to be constant

I'm guessing you want to use OpenAI as the LLM, which is fine, but you need to continue using bge for your embeddings
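A minimal sketch of that split, assuming a recent LlamaIndex (0.10+) with the `Settings` singleton and the `llama-index-embeddings-huggingface` / `llama-index-vector-stores-elasticsearch` integration packages; `my-index` and the Elasticsearch URL are placeholders:

```python
from llama_index.core import Settings, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.openai import OpenAI
from llama_index.vector_stores.elasticsearch import ElasticsearchStore

# The embedding model must stay the same one used to build the index
# (384-dim bge-small); the LLM that writes the answer can be anything.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
Settings.llm = OpenAI(model="gpt-3.5-turbo")

vector_store = ElasticsearchStore(
    index_name="my-index",           # placeholder index name
    es_url="http://localhost:9200",  # placeholder Elasticsearch URL
)
index = VectorStoreIndex.from_vector_store(vector_store)

# Queries are embedded with bge (matching the stored 384-dim vectors),
# and the retrieved text is then passed to OpenAI to generate the answer.
query_engine = index.as_query_engine()
```

This is a configuration sketch, not runnable as-is: it needs a reachable Elasticsearch instance and an `OPENAI_API_KEY` in the environment.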
If you can share your code, I can probably fix it
Yes, you were right, it is working now!