
Updated 3 months ago

Updating llama-index version and required fixes

At a glance

The community member who posted the original post has updated their llama-index version in Python and is making the required fixes. They were using a synchronous OpenSearch client, but it now seems that it must be asynchronous. The comments discuss this: one community member states that all sync and async methods are supported and implemented in the source code, while another notes that the vector client no longer has a connection class kwarg. The community members also discuss the need to upgrade the llama-index-vector-stores-opensearch package.

After a long time we've updated our llama-index version in Python and we are making the required fixes.

We were using a synchronous OpenSearch client, but now it looks like it must be asynchronous, correct?
6 comments
It had this connection_class kwarg that handled requests in a synchronous way... but it looks like the new versions of llama-index can only handle asynchronous requests...
[Attachment: image.png]
I'm not sure where you got that impression. All sync and async methods are supported and implemented in the source code
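To illustrate the point about both sync and async methods being supported: llama-index components conventionally expose paired methods (e.g. a sync `query` alongside an async `aquery`). Below is a toy sketch of that wrapping pattern, not llama-index's actual implementation; the `VectorClient` class and its methods are hypothetical stand-ins.

```python
import asyncio


class VectorClient:
    """Toy client exposing both an async method and a sync wrapper,
    mirroring the sync/async pair convention (query / aquery)."""

    async def aquery(self, text: str) -> str:
        # Async implementation (a real client would await an HTTP call here).
        await asyncio.sleep(0)
        return f"results for {text!r}"

    def query(self, text: str) -> str:
        # Sync wrapper: drive the async implementation to completion.
        return asyncio.run(self.aquery(text))


client = VectorClient()
print(client.query("hello"))  # → results for 'hello'
```

With a pattern like this, a synchronous caller keeps working even though the underlying implementation is async, which is why upgrading the vector-store package rather than rewriting the calling code can be enough.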
The vector client does not have a connection class kwarg anymore
ooof looks like llama-index-vector-stores-opensearch needed an upgrade on my side.

Appreciate that!
Ah gotcha, nice!
Should have asked the version you had for that first 😅