
Sergey
Hello. I apologize if this is a trivial question, but I'm having difficulty with it. Is it possible to create a ServiceContext that can access a remote LLM (I am using LlamaCPP with the built-in CustomLLM implementation)?

Currently, I am working in a standalone environment where the index and model are in the same process. However, I now want to run the LLM server on a separate PC. Are there any pre-existing adapters available, or should I develop this adapter myself?
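
One common approach, rather than a dedicated adapter, is to subclass CustomLLM yourself and forward each completion request over HTTP to the remote machine. Below is a minimal sketch assuming the legacy ServiceContext API (newer LlamaIndex versions use Settings instead) and assuming the remote box runs llama.cpp's built-in HTTP server, whose /completion endpoint accepts a prompt and returns the generated text under "content". The server URL, context window, and model name here are placeholders; adjust them to your deployment.

```python
from typing import Any

import requests
from llama_index import ServiceContext
from llama_index.llms import (
    CompletionResponse,
    CompletionResponseGen,
    CustomLLM,
    LLMMetadata,
)
from llama_index.llms.base import llm_completion_callback


class RemoteLlamaCPP(CustomLLM):
    # Placeholder settings; point server_url at the PC running llama.cpp's server.
    server_url: str = "http://192.168.1.50:8080"
    context_window: int = 4096
    num_output: int = 256

    @property
    def metadata(self) -> LLMMetadata:
        return LLMMetadata(
            context_window=self.context_window,
            num_output=self.num_output,
            model_name="remote-llama-cpp",
        )

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs: Any) -> CompletionResponse:
        # Forward the prompt to the remote llama.cpp server and return its output.
        resp = requests.post(
            f"{self.server_url}/completion",
            json={"prompt": prompt, "n_predict": self.num_output},
            timeout=120,
        )
        resp.raise_for_status()
        return CompletionResponse(text=resp.json()["content"])

    @llm_completion_callback()
    def stream_complete(self, prompt: str, **kwargs: Any) -> CompletionResponseGen:
        # Non-streaming fallback: yield the full completion as a single chunk.
        full = self.complete(prompt, **kwargs)
        yield CompletionResponse(text=full.text, delta=full.text)


# Plug the remote LLM into a ServiceContext as usual.
service_context = ServiceContext.from_defaults(llm=RemoteLlamaCPP())
```

As an alternative worth checking for your setup: if the remote server exposes an OpenAI-compatible API (recent llama.cpp server builds do), you may be able to skip the custom class entirely and use LlamaIndex's OpenAI-compatible LLM wrappers with the base URL pointed at the remote machine.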