Does anyone know how to get the new OpenAI embedding model to work?

At a glance

The community member is having trouble getting the new OpenAI embedding model (text-embedding-3-large) to work, encountering a TypeError when calling the get_text_embedding() method. The comments suggest updating the openai client, after which the larger embedding size (3,072 dimensions vs. 1,536) is used automatically. After updating the client, the issue is resolved, as indicated by the final comment "thanks, it works".
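
A minimal sketch of the resolved setup described above, assuming llama_index 0.10+ import paths and an openai client recent enough to accept the dimensions argument (an OPENAI_API_KEY must be set in the environment):
Python
# Sketch only: confirm the upgraded client, then request 3072-dim embeddings.
import openai
print(openai.__version__)  # the upgraded client should accept `dimensions`

from llama_index.embeddings.openai import OpenAIEmbedding

embed_model = OpenAIEmbedding(model="text-embedding-3-large", dimensions=3072)
embedding = embed_model.get_text_embedding("Your text here")
print(len(embedding))  # expected: 3072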

Does anyone know how to get the new OpenAI embedding model to work:
Python
embed_model = OpenAIEmbedding(model="text-embedding-3-large", dimensions=3072)


---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Cell In[29], line 1
----> 1 embeddings = embed_model.get_text_embedding("Your text here")
2 print(embeddings)

File ~/anaconda3/envs/embeddings/lib/python3.11/site-packages/llama_index/core/embeddings/base.py:207, in BaseEmbedding.get_text_embedding(self, text)
196 """
197 Embed the input text.
198
(...)
202 predefined instructions can be found in embeddings/huggingface_utils.py.
203 """
204 with self.callback_manager.event(
205 CBEventType.EMBEDDING, payload={EventPayload.SERIALIZED: self.to_dict()}
206 ) as event:
--> 207 text_embedding = self._get_text_embedding(text)
209 event.on_end(
210 payload={
211 EventPayload.CHUNKS: [text],
212 EventPayload.EMBEDDINGS: [text_embedding],
213 }
214 )
216 return text_embedding
...
134 return (
--> 135 client.embeddings.create(input=[text], model=engine, **kwargs).data[0].embedding
136 )

TypeError: Embeddings.create() got an unexpected keyword argument 'dimensions'


Does the vector store automatically use the larger number (3k vs. 1.5k) of dimensions?
4 comments
Update your openai client:
pip install -U openai
It will automatically use the larger size, and you can adjust the dimensions from there.
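
Regarding the follow-up question about the vector store: a hedged sketch, assuming the llama_index 0.10+ Settings API and the default in-memory SimpleVectorStore, which stores vectors at whatever dimension the embed model returns. External vector databases typically need their collection created with a matching 3,072-dim setting.
Python
# Sketch only: set the global embed model, build an index, and retrieve.
# Requires OPENAI_API_KEY in the environment.
from llama_index.core import Document, Settings, VectorStoreIndex
from llama_index.embeddings.openai import OpenAIEmbedding

# 3072-dim vectors flow straight into the default in-memory vector store.
Settings.embed_model = OpenAIEmbedding(model="text-embedding-3-large", dimensions=3072)

index = VectorStoreIndex.from_documents([Document(text="Your text here")])
nodes = index.as_retriever(similarity_top_k=1).retrieve("Your text here")
print(nodes[0].score)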