
Updated 7 months ago

Is there a way to use the HuggingFace Inference API to generate embeddings?

At a glance

The community member asks whether the HuggingFace Inference API can be used to generate embeddings; they have a private endpoint deployed that they would like to use for a Jina AI model. Another community member points to the LlamaIndex documentation, which provides a text embeddings inference example. However, it is unclear whether that solution can point at a private inference URL or whether it only works with a local model.
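A minimal sketch of the LlamaIndex route under discussion, assuming the TextEmbeddingsInference class from the docs and assuming its base_url parameter can point at a remotely deployed server rather than localhost; the model name and URL below are placeholders:

```python
# Sketch: LlamaIndex's TextEmbeddingsInference embedding class pointed at a
# remote server instead of the default localhost address.
# Assumptions: a recent llama-index with the
# llama-index-embeddings-text-embeddings-inference package installed;
# the model name and base URL are placeholders.
from llama_index.embeddings.text_embeddings_inference import TextEmbeddingsInference

embed_model = TextEmbeddingsInference(
    model_name="jinaai/jina-embeddings-v2-base-en",        # placeholder model id
    base_url="https://my-private-endpoint.example.com",    # placeholder: your deployed endpoint
    timeout=60,
    embed_batch_size=10,
)

# Embed a single piece of text and inspect the vector length
vector = embed_model.get_text_embedding("Hello, embeddings!")
print(len(vector))
```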

Is there a way to use the HuggingFace Inference API to generate embeddings? I have a private endpoint deployed that I would like to use for Jina AI.
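One hedged way to hit a private HuggingFace Inference Endpoint for embeddings is the feature-extraction task via huggingface_hub's InferenceClient; the endpoint URL and token below are placeholders, and this assumes the deployed model supports feature extraction:

```python
# Sketch: calling a private HuggingFace Inference Endpoint for embeddings via
# huggingface_hub's InferenceClient. The endpoint URL and token are placeholders,
# and this assumes the deployed model exposes the feature-extraction task.
from huggingface_hub import InferenceClient

client = InferenceClient(
    model="https://my-private-endpoint.endpoints.huggingface.cloud",  # placeholder endpoint URL
    token="hf_xxx",                                                   # placeholder access token
)

# feature_extraction returns the embedding(s) as a numpy array
embedding = client.feature_extraction("Is there a way to use the HuggingFace Inference API?")
print(embedding.shape)
```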
3 comments
But can you point that at an inference URL? Or does it only work on a local model?
Oh, I see. You run that as a server.
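Since text-embeddings-inference is just an HTTP server, the same request should work whether it runs locally or behind a private URL; a rough sketch with a placeholder base URL, assuming the server exposes TEI's /embed route:

```python
# Sketch: hitting a text-embeddings-inference server's /embed route directly.
# Works the same whether the server runs on localhost or behind a private URL.
# The base URL (and any auth header your deployment needs) are placeholders.
import requests

BASE_URL = "http://127.0.0.1:8080"  # placeholder; could also be https://my-private-endpoint.example.com

resp = requests.post(
    f"{BASE_URL}/embed",
    json={"inputs": ["first text to embed", "second text to embed"]},
    timeout=60,
)
resp.raise_for_status()
embeddings = resp.json()  # list of embedding vectors, one per input
print(len(embeddings), len(embeddings[0]))
```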