
Updated 6 months ago

LLMs

At a glance

The community members discuss whether the llama-index library supports Gemini Pro Experimental and Command-R/R+. One community member confirms that there is a Gemini LLM class, which may also be supported through Vertex. Another notes that Command R is offered through the Cohere API, and that there is an LLM class for it as well. However, one community member had trouble using Command R through the Cohere class, and another suggests updating to the latest package (pip install -U llama-index-llms-cohere). The community members also share a code example and a link to the library's source code, but there is no explicitly marked answer to the original question.

Useful resources
does llama-index support Gemini Pro Experimental or Command-R/R+
8 comments
Yes, there's a Gemini LLM class (I believe it's also supported through Vertex)
Command R is offered through Cohere's API; there's also an LLM class for that
oh dang i tried using command R through cohere class it no bueno
im probably messing something up
Hmm, it should have worked, but you might need the latest package -- pip install -U llama-index-llms-cohere
is it

from llama_index.llms.cohere import Cohere

llm = Cohere(model="command-r", api_key=api_key)
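For the Gemini side of the question, a minimal sketch along the same lines as the Cohere snippet above. This assumes the integration lives in a separate llama-index-llms-gemini package (mirroring the per-integration packaging used for Cohere) and exposes a Gemini class; the model string is illustrative.

```python
# Sketch only: assumes the llama-index-llms-gemini package and a Gemini
# class, mirroring the Cohere integration shown above.
try:
    # pip install -U llama-index-llms-gemini
    from llama_index.llms.gemini import Gemini
    have_gemini = True
except ImportError:
    have_gemini = False  # integration package not installed in this environment

if have_gemini:
    # Model name is illustrative; an api_key argument can also be passed
    # explicitly instead of relying on environment configuration (assumption).
    llm = Gemini(model="models/gemini-pro")
    print(llm.complete("hello"))
```

As with the Cohere class, an outdated package is a likely cause of failures, so running the pip upgrade first is worth trying before debugging further.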