Accessing endpoint models in the serverless Inference API
datadaba · 2 months ago
I can see how to use the serverless Inference API, but how do I access endpoint models?
4 comments
WhiteFang_Jr · 2 months ago
I believe this can help you:
https://docs.llamaindex.ai/en/stable/examples/llm/huggingface/#using-hugging-face-text-generaton-inference
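Beyond the linked llama-index docs, a minimal sketch of the general idea: point a client at your dedicated endpoint's URL instead of a model repo ID. This uses `huggingface_hub`'s `InferenceClient`; the endpoint URL and token below are placeholders, not real values.

```python
from huggingface_hub import InferenceClient

# Hypothetical endpoint URL -- copy the real one from your Inference
# Endpoints dashboard. Passing a URL (rather than a repo ID) as `model`
# routes requests to the dedicated endpoint instead of the serverless API.
ENDPOINT_URL = "https://my-endpoint.us-east-1.aws.endpoints.huggingface.cloud"

client = InferenceClient(model=ENDPOINT_URL, token="hf_xxx")  # placeholder token

# The actual call needs a live endpoint, so it is left commented out here:
# reply = client.text_generation("Hello!", max_new_tokens=64)
```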
Logan M · 2 months ago
The Inference API works for endpoint models too, actually.
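Concretely, a sketch under the assumption that the endpoint speaks the same JSON protocol as the serverless Inference API: the only change is the URL you POST to. The endpoint URL and token below are illustrative placeholders.

```python
import json
import urllib.request

def build_request(endpoint_url: str, token: str, prompt: str) -> urllib.request.Request:
    """Build the POST request; the endpoint URL comes from your HF dashboard."""
    payload = json.dumps({"inputs": prompt}).encode("utf-8")
    return urllib.request.Request(
        endpoint_url,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",  # your HF access token
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Hypothetical endpoint URL -- substitute the one shown for your deployment.
req = build_request(
    "https://my-endpoint.us-east-1.aws.endpoints.huggingface.cloud",
    "hf_xxx",
    "Hello!",
)
# urllib.request.urlopen(req) would send it; omitted here (needs a live endpoint).
```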
datadaba · 2 months ago
Thanks guys
datadaba · 2 months ago
Thanks mate