Can I use a custom LLM?

At a glance

The community member asks whether they can use a custom LLM (Large Language Model) or embedding API that they have deployed themselves as the underlying model in the llamaindex library. Another community member responds that this is possible and provides links to the relevant documentation on using custom LLM and embedding models in llamaindex. The original poster then expresses gratitude for the information.

Useful resources
I am wondering: may I use a custom LLM/embedding API (deployed by myself) as the underlying LLM/embedding in llamaindex?