
I'm also having some difficulty calling my LLM from Bedrock using the same code as before - 7 pydantic validation errors 😦
pydantic v2 migration be rough
Make sure everything is updated. If it's too borked, don't be afraid to start over in a fresh venv
hmm, these error messages suggest that many things are missing - but this code works on LlamaIndex 0.10.68. region_name is definitely there, as are access_key_id and secret_access_key.

And this was from creating a fresh env. Do you face this error?
[Attachment: image.png]
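For reference, a minimal sketch of the kind of setup being discussed, assuming the llama-index-llms-bedrock integration; the model id, region, and credential values are placeholders, and the move to pydantic v2 in newer llama-index-core releases validates these fields more strictly than v1 did.

```python
# Minimal sketch (not the poster's actual code): constructing the Bedrock LLM
# with explicit credentials via the llama-index-llms-bedrock integration.
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(
    model="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model id
    region_name="us-east-1",                          # placeholder region
    aws_access_key_id="YOUR_ACCESS_KEY_ID",           # placeholder credential
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",   # placeholder credential
)

print(llm.complete("Hello from Bedrock"))
```

If a call like this starts raising pydantic validation errors after an upgrade, a reasonable first check is that llama-index-core and llama-index-llms-bedrock were upgraded together rather than mixing 0.10.x and 0.11.x packages.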
lame, pydantic v2 is catching errors that pydantic v1 never triggered
Will have to patch the LLM
Sorry @Logan M, I also found the error relating to llama-index-embeddings-azure-openai (thread: https://github.com/run-llama/llama_index/issues/15575#issuecomment-2308001583).

The reason is that the latest version of this library only supports Python up to 3.11, and I'm using Python 3.12. So when I try to upgrade the library, it resolves to the last version that still supports Python 3.12, which uses llama-index 0.10.68 as its core. Of course I could switch everything to Python 3.11, but my project is in active development on Python 3.12.

Would it be possible to extend support to Python 3.12?
Oh weird. It would just be a simple patch to the pyproject.toml to allow Python 3.12
yeah precisely! haha! I was just wondering if there was a reason why 3.12 wasn't supported
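For context, a sketch of the kind of pyproject.toml change being described above; the version bounds here are illustrative, and the real constraint in llama-index-embeddings-azure-openai may differ.

```toml
# Illustrative only: widen the supported Python range in the integration's
# pyproject.toml so the resolver can pick the latest release on Python 3.12.
[tool.poetry.dependencies]
# before: python = ">=3.8.1,<3.12"   (caps support at Python 3.11)
python = ">=3.8.1,<3.13"             # allows Python 3.12 as well
```

With the constraint widened, a Python 3.12 environment can resolve the current release of the integration instead of falling back to an older one that pins llama-index 0.10.68.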