How do I use LlamaIndex to access OpenRouter-supported models? Can you provide a code snippet to do this?
Looks like OpenRouter supports OpenAI's client API. We just landed an OpenAILike LLM class which should support this directly: https://github.com/run-llama/llama_index/pull/7973

Should be available in the next release
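
A minimal sketch of what this could look like, assuming the OpenAILike class accepts the same `api_base` / `api_key` / `is_chat_model` parameters as the existing OpenAI LLM class and that OpenRouter's OpenAI-compatible endpoint lives at https://openrouter.ai/api/v1 (the model ID shown is just an example; check the release notes once the PR ships):

```python
# Sketch only: point LlamaIndex's OpenAILike LLM at OpenRouter's
# OpenAI-compatible endpoint. Import path and parameter names are
# assumptions based on the existing OpenAI LLM class.
from llama_index.llms import OpenAILike

llm = OpenAILike(
    model="mistralai/mistral-7b-instruct",    # any OpenRouter model ID
    api_base="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible base URL
    api_key="YOUR_OPENROUTER_API_KEY",
    is_chat_model=True,                       # route requests through the chat completions API
)

response = llm.complete("Say hello from OpenRouter")
print(response)
```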
@disiok interested in chatting about a more formal OpenRouter integration?