For open-source models like the ones hosted on Hugging Face, LlamaIndex downloads them locally, right?
That's right! And you also have the option to use the InferenceAPI if you want.
I'd prefer the Inference API. Could you share the doc link to this?
remotely_run
[Attachment: image.png]
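For reference, a minimal sketch of the two options discussed above: running the model locally (LlamaIndex downloads the weights from the Hub) versus calling it through the Hugging Face Inference API. The import paths, parameters, and model name are assumptions that depend on your LlamaIndex version; check the docs for HuggingFaceLLM and HuggingFaceInferenceAPI.

```python
# Assumes the llama-index-llms-huggingface and
# llama-index-llms-huggingface-api packages are installed.
from llama_index.llms.huggingface import HuggingFaceLLM
from llama_index.llms.huggingface_api import HuggingFaceInferenceAPI

# Option 1: download the model and run it locally
# (weights are pulled from the Hugging Face Hub onto your machine).
local_llm = HuggingFaceLLM(
    model_name="HuggingFaceH4/zephyr-7b-beta",  # example model
    tokenizer_name="HuggingFaceH4/zephyr-7b-beta",
)

# Option 2: call the model remotely via the Hugging Face Inference API,
# so nothing is downloaded locally.
remote_llm = HuggingFaceInferenceAPI(
    model_name="HuggingFaceH4/zephyr-7b-beta",
    token="hf_...",  # your Hugging Face access token
)

print(remote_llm.complete("Briefly explain what LlamaIndex does."))
```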