Accessing endpoint models vs. the serverless Inference API

At a glance

The community member asks how to access endpoint models, since they are already familiar with the serverless Inference API. Another community member replies that the Inference API also works for endpoint models and links to an example that uses Hugging Face text-generation-inference. The remaining comments express gratitude but do not add to the answer.

I can see how to use the Inference API, which is serverless, but let's say I want to access endpoint models. How do I do that?
The Inference API works for endpoint models too, actually.
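A minimal sketch of what that looks like in practice, assuming you have a deployed Inference Endpoint and a Hugging Face access token (the URL and token below are placeholders): the `InferenceClient` from `huggingface_hub` accepts either a model id (serverless) or an Inference Endpoint URL (dedicated), so the same client code works for both.

```python
from huggingface_hub import InferenceClient

# Placeholder values: replace with your own endpoint URL and token.
ENDPOINT_URL = "https://my-endpoint.us-east-1.aws.endpoints.huggingface.cloud"
HF_TOKEN = "hf_xxx"

# Passing a model id here would use the serverless Inference API;
# passing an Inference Endpoint URL calls the dedicated endpoint instead.
client = InferenceClient(model=ENDPOINT_URL, token=HF_TOKEN)

# For endpoints serving a text-generation model (e.g. via text-generation-inference),
# the text_generation helper sends the prompt and returns the generated text.
output = client.text_generation(
    "Explain the difference between serverless and dedicated endpoints in one sentence.",
    max_new_tokens=64,
)
print(output)
```

If you prefer not to use the client, you can also send a plain HTTP POST to the endpoint URL with an `Authorization: Bearer <token>` header and an `{"inputs": ...}` JSON body, the same request shape the serverless Inference API expects.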