I am getting this error when I am trying to use llama-2-13B-chat or Mistral from Amazon Bedrock
PrashantK · 8 months ago
I am getting this error when I am trying to use the llama-2-13B-chat or Mistral model from Amazon Bedrock. May I know what I should do and how it can be resolved?
(Attachment: error message)
WhiteFang_Jr · 8 months ago
Try upgrading llama-index core with:
pip install --upgrade llama-index-core
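For reference, here is a minimal sketch of calling a Bedrock-hosted model through LlamaIndex after upgrading. It assumes the llama-index-llms-bedrock integration package is installed and that your AWS credentials and region are already configured; the model IDs shown are examples and may differ from what is enabled in your account.

# Minimal sketch, not the poster's exact setup.
# Assumes: pip install llama-index-llms-bedrock
# and AWS credentials configured for a region where the model is enabled.
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(
    model="meta.llama2-13b-chat-v1",  # example ID; check your Bedrock console
    region_name="us-east-1",          # example region; use your own
)

response = llm.complete("Hello from Bedrock")
print(response.text)

If the error persists after upgrading, it usually helps to also upgrade the Bedrock integration package itself and confirm the model is actually enabled for your account in that region.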