I have the weirdest problem I've encountered so far. This piece of code works perfectly:
from openai import OpenAI
client = OpenAI(base_url="http://localhost:xxx", api_key="xxx")
response = client.embeddings.create(input=["Your string here"], model="Azure-Text-Embedding-ada-002")
embedding_vector = response.data[0].embedding
However, if I try to pass the client through a class, like this:
class CustomEmbedding:
    def __init__(self, client, model="Azure-Text-Embedding-ada-002"):
        self.client = client
        self.model = model

    def embed(self, texts):
        response = self.client.embeddings.create(input=texts, model=self.model)
        return [r.embedding for r in response.data]
[...]
client = OpenAI(base_url="http://localhost:xxx", api_key="xxx")
embed_model = CustomEmbedding(client, model="Azure-Text-Embedding-ada-002")
I get this:
AttributeError: 'OpenAI' object has no attribute 'embeddings'
And I checked: self.client really does not have an embeddings attribute inside the class, but it does have one outside of it. I literally have no clue what is going on. I need to wrap the client in this CustomEmbedding class, because I have no idea how else to define a LlamaIndex service_context (without a local OpenAI key). Any ideas?
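To show what I mean by "I checked", a diagnostic along these lines reproduces the discrepancy (the exact prints are just for illustration, not code I need in production):

import openai  # only needed for the version check

# same client object as in the first snippet above
print(openai.__version__)               # which openai package is actually imported here?
print(type(client))                     # <class 'openai.OpenAI'>
print(hasattr(client, "embeddings"))    # True at module level

# and the equivalent checks inside CustomEmbedding.__init__ would be:
#     print(type(self.client))                    # is it still the same class?
#     print(hasattr(self.client, "embeddings"))   # this is what comes back False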