Is there a way to use a remote LLM but have everything run locally?

At a glance

The community member asks whether there is a way to use a remote large language model (LLM) while running everything else locally. In the comments, another community member suggests setting up the remote LLM through LlamaIndex's custom LLM abstraction. There is no explicitly marked answer in the thread.
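
If the suggestion follows LlamaIndex's documented custom-LLM pattern, a minimal sketch of a remote-backed LLM might look like the following. The class name, endpoint URL, and the request/response JSON shape are illustrative assumptions, not details from the thread; CustomLLM, CompletionResponse, CompletionResponseGen, and LLMMetadata are part of LlamaIndex's custom LLM interface.

```python
from typing import Any

import requests
from llama_index.core.llms import (
    CompletionResponse,
    CompletionResponseGen,
    CustomLLM,
    LLMMetadata,
)
from llama_index.core.llms.callbacks import llm_completion_callback


class RemoteLLM(CustomLLM):
    """Forwards completion calls to a remote model over HTTP."""

    # Hypothetical endpoint; replace with your remote service's URL.
    endpoint: str = "http://remote-host:8000/completion"
    context_window: int = 3900
    num_output: int = 256

    @property
    def metadata(self) -> LLMMetadata:
        return LLMMetadata(
            context_window=self.context_window,
            num_output=self.num_output,
            model_name="remote-llm",
        )

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs: Any) -> CompletionResponse:
        # Assumes the service accepts {"prompt": ...} and returns {"text": ...}.
        resp = requests.post(self.endpoint, json={"prompt": prompt}, timeout=60)
        resp.raise_for_status()
        return CompletionResponse(text=resp.json()["text"])

    @llm_completion_callback()
    def stream_complete(self, prompt: str, **kwargs: Any) -> CompletionResponseGen:
        # Simple non-streaming fallback: fetch the full completion, yield once.
        text = self.complete(prompt, **kwargs).text
        yield CompletionResponse(text=text, delta=text)
```

With such a class, the rest of the pipeline (indexing, embeddings, retrieval) stays local; setting Settings.llm = RemoteLLM() (via from llama_index.core import Settings) routes only the completion calls to the remote model.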

Useful resources
Is there a way to use a remote LLM but have everything run locally?