
Updated 4 months ago

LLMs

Does llama-index support Gemini Pro experimental or Command R/R+?
8 comments
Yes, there's a Gemini LLM class (I believe it's also supported through Vertex)
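For reference, a minimal sketch of the Gemini class (assuming the llama-index-llms-gemini integration package is installed; the model name and API key are placeholders, and parameter names may differ slightly by version):

# pip install -U llama-index-llms-gemini
from llama_index.llms.gemini import Gemini

# "models/gemini-pro" is a placeholder; swap in whichever Gemini model
# identifier Google exposes for your account (experimental models included).
llm = Gemini(model="models/gemini-pro", api_key="YOUR_GOOGLE_API_KEY")
print(llm.complete("Hello, Gemini"))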
Command R is offered over Cohere's API; there's also an LLM class for that
Oh dang, I tried using Command R through the Cohere class and it didn't work.
I'm probably messing something up.
Hmm, it should have worked, but you might need the latest package: pip install -U llama-index-llms-cohere
Is it this?

from llama_index.llms.cohere import Cohere

llm = Cohere(model="command-r", api_key=api_key)
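For a quick sanity check after upgrading, something like this should run end to end (the prompt is just a placeholder, and the API key is read from the environment here):

import os
from llama_index.llms.cohere import Cohere

# Cohere model ids are lowercase, e.g. "command-r" or "command-r-plus"
llm = Cohere(model="command-r", api_key=os.environ["COHERE_API_KEY"])

# complete() sends a single prompt and returns the model's text completion
response = llm.complete("What is LlamaIndex?")
print(response)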