The community members discuss support for Gemini Pro (experimental) and Command-R/R+ in the llama-index library. One member confirms that there is a Gemini LLM class, which may also be available through Vertex. Another notes that Command-R is offered through the Cohere API, and that llama-index has an LLM class for it as well. However, one member had trouble using Command-R through the Cohere class, and another suggests that updating to the latest package (pip install -U llama-index-llms-cohere) might be needed. The members also share some code examples and a link to the library's source code, but there is no explicitly marked answer to the original question.
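A minimal sketch of the Cohere path described in the thread, using llama-index's Cohere LLM class with Command-R. The model name `"command-r"` and the `COHERE_API_KEY` environment variable are assumptions based on the discussion, not verified against the library's source; the sketch degrades gracefully when the package or key is missing.

```python
import os


def complete_with_command_r(prompt: str) -> str:
    """Try to run `prompt` through Command-R via llama-index's Cohere
    wrapper; fall back gracefully if the package or API key is absent."""
    try:
        from llama_index.llms.cohere import Cohere
    except ImportError:
        # The thread suggests `pip install -U llama-index-llms-cohere`
        # when the Cohere class is missing or misbehaves.
        return "llama-index-llms-cohere not installed"

    api_key = os.environ.get("COHERE_API_KEY")  # assumed env var name
    if not api_key:
        return "COHERE_API_KEY not set"

    # Model name "command-r" is an assumption from the discussion.
    llm = Cohere(model="command-r", api_key=api_key)
    return llm.complete(prompt).text


result = complete_with_command_r("Say hello in one word.")
print(result)
```

If the import itself fails even after installation, the thread's advice to upgrade the split package (`pip install -U llama-index-llms-cohere`) is the first thing to try, since the modular llama-index packages are versioned separately from the core library.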