
Updated 11 months ago


I have the weirdest problem I've encountered so far. This piece of code works perfectly:

Plain Text
from openai import OpenAI
client = OpenAI(base_url="http://localhost:xxx", api_key="xxx")
response = client.embeddings.create(input=["Your string here"], model="Azure-Text-Embedding-ada-002")
embedding_vector = response.data[0].embedding


However, if I try to pass the client through a class, like this:

Plain Text
class CustomEmbedding:
    def __init__(self, client, model="Azure-Text-Embedding-ada-002"):
        self.client = client
        self.model = model

    def embed(self, texts):
        response = self.client.embeddings.create(input=texts, model=self.model)
        return [r.embedding for r in response.data]
    [...]

client = OpenAI(base_url="http://localhost:xxx", api_key="xxx")
embed_model = CustomEmbedding(client, model="Azure-Text-Embedding-ada-002")

I get this:

Plain Text
AttributeError: 'OpenAI' object has no attribute 'embeddings'

And I checked: self.client really does not have an embeddings attribute inside the class, but it does have one outside of it. I literally have no clue what is going on. I need to wrap the client in this CustomEmbedding class, as I have no idea how else to define a llamaindex service_context (without a local openai key). Any ideas?
4 comments
that is pretty weird. It seems easier to fix your service context issue lol
There is support for both azure embeddings and azure llms
I was doing something stupid:

In the second one I was doing from llamaindex import openai ... hence the different call. I found out after I wasted time debugging the rest of the code 😦
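For anyone hitting the same error: the failure mode here is name shadowing, where a wrong import rebinds the name to a different class that happens to look the same at the call site. A hedged illustration with stand-in classes (module and class names are hypothetical, not the real openai or llamaindex APIs):

```python
# Illustration of the mix-up: two classes share the name OpenAI, and
# importing the wrong one silently swaps the API underneath you.

class OpenAI:
    """Stand-in for the openai SDK client: has an .embeddings attribute."""
    def __init__(self, base_url=None, api_key=None):
        self.embeddings = object()

class _FrameworkOpenAI:
    """Stand-in for a framework's own OpenAI wrapper: no .embeddings."""
    def __init__(self, base_url=None, api_key=None):
        pass

# Simulate a wrong `from some_framework import OpenAI` shadowing the SDK:
OpenAI = _FrameworkOpenAI

# The constructor call looks identical, so nothing fails until .embeddings
# is accessed -- exactly the AttributeError reported above.
client = OpenAI(base_url="http://localhost:1234", api_key="key")
print(hasattr(client, "embeddings"))
```

Checking `type(client).__module__` (or the import lines at the top of each file) catches this much faster than debugging the class that merely stores the client.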