Lucas
7 months ago
Is there a way to use a remote LLM but have everything else run locally?
1 comment
WhiteFang_Jr
7 months ago
You can use LlamaIndex's custom LLM abstraction to set up your remote LLM:
https://docs.llamaindex.ai/en/stable/module_guides/models/llms/usage_custom/#example-using-a-custom-llm-model-advanced