Ulan Yisaev
Hello LlamaIndex community,

I'm currently integrating the Command R+ model from the Azure Marketplace into our systems using LlamaIndex. I understand there is some support for this model in LlamaIndex, but it may not be fully optimized to leverage all of its capabilities. Specifically, the original Cohere client accepts retrieved chunks through a separate "documents" field, which lets the model apply its own internal optimizations for grounded generation. In LlamaIndex, however, it seems the context has to be inserted directly into the prompt.

Could someone confirm whether LlamaIndex fully supports the Command R+ model, especially in terms of handling separate document inputs for RAG? If not, are there plans to improve this support, or should I consider a custom implementation to fully utilize the model's capabilities?

Example from the Cohere docs:
co.chat(
    model="command",
    message="Where do the tallest penguins live?",
    documents=[
        {"title": "Tall penguins", "snippet": "Emperor penguins are the tallest."},
        {"title": "Penguin habitats", "snippet": "Emperor penguins only live in Antarctica."},
        {"title": "What are animals?", "snippet": "Animals are different from plants."},
    ],
)
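
For comparison, the only workaround I can think of on the LlamaIndex side looks roughly like the sketch below: retrieve with a LlamaIndex retriever, then call the Cohere client directly so the chunks go through the "documents" field. This is just my own sketch; the model string "command-r-plus", the "file_name" metadata key, and the node-to-document mapping are my assumptions, and the Azure Marketplace endpoint configuration is omitted.

import cohere
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Retrieve chunks with LlamaIndex as usual.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
retriever = index.as_retriever(similarity_top_k=3)

query = "Where do the tallest penguins live?"
retrieved = retriever.retrieve(query)

# Map the retrieved nodes into Cohere's documents format so the model can
# handle grounding itself instead of reading chunks stuffed into the prompt.
cohere_docs = [
    {
        "title": n.node.metadata.get("file_name", f"chunk-{i}"),  # assumed metadata key
        "snippet": n.node.get_content(),
    }
    for i, n in enumerate(retrieved)
]

co = cohere.Client("COHERE_OR_AZURE_API_KEY")  # Azure endpoint setup omitted
response = co.chat(
    model="command-r-plus",  # assumed model id
    message=query,
    documents=cohere_docs,
)
print(response.text)

If LlamaIndex's Cohere integration can already route retrieved nodes into "documents" like this, I would much rather use the built-in path.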
5 comments
Hi everyone,
I was planning to use LLMLingua, especially after seeing its promising results and its integrations, such as the one with LlamaIndex. However, I noticed it has been moved to legacy status, which makes some of the useful links unavailable. Can someone explain why LLMLingua is now legacy? I'm hesitant to start using it without understanding the reasons behind this change. For reference, the way I was planning to plug it in is sketched below.
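This is roughly based on the LongLLMLingua example from the LlamaIndex docs; the import path and argument values are from memory and may have changed with the legacy move.

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.postprocessor.longllmlingua import LongLLMLinguaPostprocessor

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
retriever = index.as_retriever(similarity_top_k=10)

# Compress the retrieved context with LLMLingua before it reaches the LLM.
compressor = LongLLMLinguaPostprocessor(
    instruction_str="Given the context, please answer the final question",
    target_token=300,
    rank_method="longllmlingua",
)

query_engine = RetrieverQueryEngine.from_args(
    retriever, node_postprocessors=[compressor]
)
print(query_engine.query("Where do the tallest penguins live?"))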
Thanks!
1 comment