how to integrate a model running on another server to llama-index?

At a glance

The community member asks how to integrate a model running on another server with LlamaIndex. Another community member suggests subclassing LlamaIndex's CustomLLM class to wrap the remote model as a custom language model, and points to the LlamaIndex documentation for an example of defining a custom LLM.
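The suggestion above can be sketched roughly as follows. The transport half is plain HTTP using only the standard library; `REMOTE_URL`, the `/generate` path, and the `{"prompt": ...}` / `{"text": ...}` payload shapes are assumptions — adapt them to whatever API your model server actually exposes. In LlamaIndex you would then subclass `CustomLLM` and call a helper like this from its `complete()` method; see the linked documentation for the exact class interface, which varies across versions.

```python
import json
import urllib.request

# Hypothetical endpoint for a model served on another machine; adjust the
# URL and the request/response JSON shapes to your server's actual API.
REMOTE_URL = "http://my-model-host:8000/generate"

def remote_complete(prompt: str, url: str = REMOTE_URL,
                    opener=urllib.request.urlopen) -> str:
    """POST a prompt to the remote model server and return the generated text."""
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with opener(req) as resp:
        return json.loads(resp.read().decode("utf-8"))["text"]

# In a LlamaIndex CustomLLM subclass, complete() would delegate here, e.g.
# return CompletionResponse(text=remote_complete(prompt)) -- see the docs
# linked below for the full subclass skeleton (metadata, stream_complete).
```

The `opener` parameter exists only so the network call can be swapped out (for tests, retries, or a pooled HTTP client); in normal use you would call `remote_complete(prompt)` and let it hit the real server.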

Useful resources
how to integrate a model running on another server to llama-index?