I am getting this error when I am trying
PrashantK
7 months ago
I am getting this error when I am trying to use the llama-2-13B-chat or Mistral model from Amazon Bedrock. May I know what I should do and how it can be resolved?
[Attachment]
WhiteFang_Jr
7 months ago
Try upgrading llama-index-core with:
pip install --upgrade llama-index-core
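In case it helps, here is a minimal sketch of calling a Bedrock-hosted Llama 2 or Mistral model through LlamaIndex once the upgrade is done. It assumes the `llama-index-llms-bedrock` integration package is installed and that AWS credentials are configured; the model ID, region, and profile name shown are placeholders to replace with your own values.

```python
# Minimal sketch, assuming: pip install llama-index-llms-bedrock
# and AWS credentials with Bedrock access already configured.
from llama_index.llms.bedrock import Bedrock

# Placeholder model ID, region, and profile; use the ones enabled
# in your own AWS account (e.g. a Mistral ID instead of Llama 2).
llm = Bedrock(
    model="meta.llama2-13b-chat-v1",
    region_name="us-east-1",
    profile_name="default",
)

response = llm.complete("Hello from Bedrock via LlamaIndex")
print(response)
```

If the error persists after upgrading the core package, it may also be worth upgrading the Bedrock integration itself, e.g. `pip install --upgrade llama-index-llms-bedrock`.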