How to use LlamaIndex to access OpenRouter-supported models

At a glance

The post asks how to use LlamaIndex to access OpenRouter-supported models and requests a code snippet. A community member responds that OpenRouter supports OpenAI's client API and that LlamaIndex has just added an OpenAILike LLM class, which should support this directly and will ship in the next release. Another community member expresses interest in discussing a more formal OpenRouter integration.

How to use LlamaIndex to access OpenRouter-supported models? Can you provide a code snippet to do this?
2 comments
Looks like OpenRouter supports OpenAI's client API. We just landed an OpenAILike LLM class which should support this directly: https://github.com/run-llama/llama_index/pull/7973

Should be available in the next release
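
Below is a minimal sketch of the approach described in that comment: pointing the OpenAILike class at OpenRouter's OpenAI-compatible endpoint. The model ID, import path, and the OPENROUTER_API_KEY environment variable are illustrative assumptions, not from the thread, and the import path may differ depending on your llama_index release.

```python
import os

# Minimal sketch (not from the thread): point LlamaIndex's OpenAILike LLM
# at OpenRouter's OpenAI-compatible endpoint. The import path below is for
# recent llama_index releases; older releases used
# `from llama_index.llms import OpenAILike`.
from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    # Any OpenRouter model ID works here; this one is just an example.
    model="mistralai/mistral-7b-instruct",
    # OpenRouter exposes an OpenAI-compatible API at this base URL.
    api_base="https://openrouter.ai/api/v1",
    # Assumes your OpenRouter API key is set in this environment variable.
    api_key=os.environ["OPENROUTER_API_KEY"],
    # Route requests through the chat-completions endpoint.
    is_chat_model=True,
)

response = llm.complete("Say hello in one short sentence.")
print(response)
```

Because OpenRouter implements OpenAI's client API, nothing OpenRouter-specific is needed beyond the base URL and API key; the resulting llm can be used anywhere LlamaIndex expects an LLM.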
@disiok interested in chatting about a more formal OpenRouter integration?