
Updated 9 months ago

Hello LlamaIndex community,


I'm currently integrating the Command R+ model from Azure Marketplace into our systems using LlamaIndex. I understand that there is some support for this model in LlamaIndex, but it might not be fully optimized to leverage all its capabilities. Specifically, the original Cohere client handles the "documents" field by adding chunks separately, which allows for internal optimizations by the model. However, it seems like in LlamaIndex, context needs to be added directly into the prompt.

Could someone confirm if LlamaIndex fully supports the Command R+ model, especially in terms of handling separate document inputs for RAG? If not, are there any plans to update this support or should I consider custom implementations to fully utilize the model's capabilities?

Example from the Cohere docs:
co.chat(
    model="command",
    message="Where do the tallest penguins live?",
    documents=[
        {"title": "Tall penguins", "snippet": "Emperor penguins are the tallest."},
        {"title": "Penguin habitats", "snippet": "Emperor penguins only live in Antarctica."},
        {"title": "What are animals?", "snippet": "Animals are different from plants."},
    ],
)
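In case it helps anyone else, here is a minimal sketch of the custom-implementation route I was considering while waiting for official support: retrieve chunks with LlamaIndex as usual, then format them into Cohere's `documents` payload yourself and call the raw Cohere client, instead of inlining the context into the prompt. The helper and the `retrieved_chunks` structure below are illustrative assumptions, not LlamaIndex or Cohere API surface.

```python
def to_cohere_documents(retrieved_chunks):
    """Convert (title, text) chunk pairs into Cohere's documents payload.

    `retrieved_chunks` stands in for whatever your LlamaIndex retriever
    returns (e.g. node metadata title + node text); adapt as needed.
    """
    return [
        {"title": title, "snippet": text}
        for title, text in retrieved_chunks
    ]


retrieved_chunks = [
    ("Tall penguins", "Emperor penguins are the tallest."),
    ("Penguin habitats", "Emperor penguins only live in Antarctica."),
]
documents = to_cohere_documents(retrieved_chunks)

# Then call the raw Cohere client directly (assumes `co = cohere.Client(...)`
# has been created with your API key):
# response = co.chat(
#     model="command-r-plus",
#     message="Where do the tallest penguins live?",
#     documents=documents,
# )
```

This keeps LlamaIndex for indexing/retrieval while letting the model see the chunks as separate documents, which is the behavior the original Cohere client provides.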
5 comments
The Cohere LLM integration is actually being updated to handle this.
Thanks for the update, @Logan M! Do you have an estimated timeline for when the updated integration might be released? I was considering implementing a custom solution, but if it's just around the corner, I might wait for the official update. 😊
Hopefully today!
Great to hear that, Logan! Looking forward to the update then. Thanks for the quick response! 😊