Updated 5 months ago

Hi,
I want to use a custom embedding model served by an embedding server over HTTPS. I found the documentation at https://docs.llamaindex.ai/en/stable/examples/custom_embeddings/ , but it doesn't show how to connect to a service on the internet. Does anyone have an example?
11 comments
You can do something like this:

Plain Text
    def _get_query_embedding(self, query: str) -> List[float]:
        # Send the POST request; the shape of data follows your embedding server's API
        response = requests.post(url, json=data)
        embeddings = response.json()
        return embeddings
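To make the snippet above self-contained and runnable, here is a sketch of the same pattern. The payload shape {"text": ...} and the response key "embedding" are assumptions to adapt to your server's API; a tiny local stand-in server replaces the real HTTPS endpoint so the example runs offline:

```python
# Sketch: call a remote embedding server over HTTP. The request/response
# shapes are assumptions -- match them to your server's actual API.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from typing import List

import requests  # third-party: pip install requests


def remote_embed(url: str, api_key: str, text: str) -> List[float]:
    """POST the text to the embedding server and return the vector."""
    response = requests.post(
        url,
        headers={"Content-Type": "application/json", "Api-Key": api_key},
        json={"text": text},  # assumed payload shape
        timeout=30,
    )
    response.raise_for_status()  # surface HTTP errors early
    return response.json()["embedding"]  # assumed response key


# --- Tiny stand-in server so the example runs without a real endpoint ---
class _FakeEmbeddingHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        # Return a fixed 3-dimensional "embedding" for whatever text arrives.
        payload = json.dumps({"embedding": [0.1, 0.2, 0.3], "text": body["text"]})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload.encode())

    def log_message(self, *args):  # silence per-request logging
        pass


server = HTTPServer(("127.0.0.1", 0), _FakeEmbeddingHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

vector = remote_embed(f"http://127.0.0.1:{server.server_port}", "xxxxxx", "Ala ma kota")
print(vector)  # [0.1, 0.2, 0.3]
server.shutdown()
```

In a real `_get_query_embedding`, you would call `remote_embed(url, api_key, query)` and return the result.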
Thanks. One more question: how can I check what will be sent to the URL?
You can add your own logger and log what is being sent to the URL.
But what exactly will I send to the URL? Is it the data dictionary?
Yes, but it depends entirely on the endpoint and how it accepts the data.
Some take it as a JSON body and some take it as a query parameter.
I just tried it with this code:

def _get_query_embedding(self, query: str) -> List[float]:
    url = "https://ddddd.com"
    headers = {
        "Content-Type": "application/json",
        "Api-Key": "xxxxxx",
    }
    data = {
        "text": "Ala ma kota"  # hard-coded test text; use query in real use
    }

    # note: verify=False disables TLS certificate verification
    response = requests.post(url, headers=headers, json=data, verify=False)
    embeddings = response.json()
    return embeddings[0]
I'm calling the code like:
embed_model = InstructorEmbeddings(embed_batch_size=2)
and as output I have:

No sentence-transformers model found with name sentence-transformers/hub. Creating a new one with mean pooling.

OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like sentence-transformers/hub is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
Looks like it's ignoring my code and trying to reach Hugging Face instead. Can you help?
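A likely cause (an assumption, based on the linked docs page): the InstructorEmbeddings example class loads a local sentence-transformers model in its __init__, so merely constructing the class reaches Hugging Face, no matter what _get_query_embedding does. The plain classes below stand in for the BaseEmbedding subclass to sketch the difference:

```python
# Hypothetical sketch: why constructing the class still hits Hugging Face,
# and what an HTTP-only variant would avoid. These are plain stand-in
# classes, not real BaseEmbedding subclasses.
class DocsStyleEmbeddings:
    """Mimics the docs example: the constructor loads a local model."""

    def __init__(self, embed_batch_size: int = 2):
        # Stand-in for the model-loading line in the docs example, which
        # is what actually downloads from huggingface.co on construction.
        self.local_model_loaded = True


class HttpOnlyEmbeddings:
    """HTTP-only variant: the constructor performs no model loading."""

    def __init__(self, embed_batch_size: int = 2):
        self.embed_batch_size = embed_batch_size  # nothing touches Hugging Face
```

If this is the cause, the fix is to remove the local-model loading from __init__ of your custom class and keep only the HTTP request logic in the _get_*_embedding methods.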