Hi everyone, I was trying to use Claude 3

At a glance

The community member was trying to use Claude 3 but encountered an error stating that the model was unknown. After updating the llama-index package to the latest version they were able to use Claude 2.1, but found it too slow compared to GPT-3.5. The comments suggest updating the Anthropic package, checking the supported models, and migrating the code out of the legacy imports. However, there is no explicitly marked answer, and the community member continued to run into issues getting the Claude 3 model working.

Hi everyone, I was trying to use Claude 3 but failed. The error is: "Unknown model: claude-3-opus-20240229. Please provide a valid Anthropic model name. Known models are: claude-instant-1, claude-instant-1.2, claude-2, claude-2.0, claude-2.1". I updated llama-index to the latest version, 0.10.20. I tested claude-2.1; my code runs and returns a result, but it takes too long compared to gpt-3.5.
17 comments
Did you update the anthropic package?
pip install llama-index-llms-anthropic
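For reference, here is a minimal sketch (not taken from the thread) of a quick check once that integration package is installed; it assumes llama-index 0.10.x, the non-legacy import path, and an ANTHROPIC_API_KEY in the environment:

# Sketch: the non-legacy class should accept Claude 3 model names once
# llama-index-llms-anthropic is up to date. No API call is made here.
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(model="claude-3-opus-20240229")
print(llm.metadata)  # shows context window, model name, etc.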
When I import Anthropic from legacy, I cannot use Claude 3. Without legacy, I get a callback problem: "ValueError: Cannot use llm_chat_callback on an instance without a callback_manager attribute."
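As an aside, that ValueError usually points to the LLM instance missing a callback manager; here is a minimal sketch of passing one explicitly, assuming the non-legacy class and llama-index 0.10.x (an illustration, not confirmed in the thread):

from llama_index.core.callbacks import CallbackManager
from llama_index.llms.anthropic import Anthropic

# Sketch: give the LLM an explicit (empty) callback manager so the
# llm_chat_callback decorator finds a callback_manager attribute.
llm = Anthropic(
    model="claude-3-opus-20240229",
    callback_manager=CallbackManager([]),
)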
Are you using other legacy imports? You should be using all legacy or none at all.

How did you set up the LLM?
I import everything from legacy except Anthropic. When I import Anthropic from legacy and use a Claude 3 model like Haiku, I get this error when I call chat/complete: {'type': 'invalid_request_error', 'message': '"claude-3-haiku-20240307" is not supported on this API. Please use the Messages API instead.'} I did not get this error when importing without legacy. Which Messages API should I use? I only see an API message to prompt ... here.
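On the Messages API point: the non-legacy Anthropic integration is built on the newer Anthropic client, so a chat-style call goes through it directly. A minimal sketch, assuming llama-index 0.10.x and an ANTHROPIC_API_KEY in the environment (the model name is just an example):

from llama_index.core.llms import ChatMessage
from llama_index.llms.anthropic import Anthropic

# Sketch: a chat() call through the non-legacy class, avoiding the
# old completions-style endpoint that rejects Claude 3 models.
llm = Anthropic(model="claude-3-opus-20240229")
resp = llm.chat([ChatMessage(role="user", content="What is the meaning of life?")])
print(resp)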
I've not seen this issue. Do you have the latest version? pip install -U llama-index-llms-anthropic ?
Yes, I already installed it.
But you updated to the latest?
And updated the anthropic client itself? pip install -U anthropic ?
And imported with from llama_index.llms.anthropic import Anthropic ?
from llama_index.legacy.llms.anthropic import Anthropic
llm = Anthropic(
    model="claude-3-haiku-20240307",
)
resp = llm.complete("What is meaning of life?")

I imported from legacy.
I don't think it will work with legacy
It's legacy for a reason 😅
Thank you for the support. Maybe I need to migrate my code out of legacy.
I don't believe Haiku is supported on the API yet, as it says. I was using Sonnet, switched to Haiku, and got the same error.
It's available via Vertex, but it looks like llama-index hasn't updated its Vertex LLM package to support it yet.
I migrated my code out of legacy and can now call the API for Gemini and Claude 3 in chat mode. But I need to install each integration package separately, and then I run into package conflicts; pip install tries to resolve the dependencies and takes a while.
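For anyone hitting the same wall, here is a minimal sketch of what the migrated, non-legacy setup can look like for both providers; it assumes the llama-index-llms-anthropic and llama-index-llms-gemini packages are installed and that API keys are available via the usual environment variables (model names are examples, not copied from the thread):

from llama_index.core.llms import ChatMessage
from llama_index.llms.anthropic import Anthropic
from llama_index.llms.gemini import Gemini

messages = [ChatMessage(role="user", content="What is the meaning of life?")]

# Sketch: one non-legacy integration package per provider.
claude = Anthropic(model="claude-3-opus-20240229")
print(claude.chat(messages))

gemini = Gemini()  # default Gemini model; assumes Google credentials are configured
print(gemini.chat(messages))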