Updating llama-index version and required fixes

After a long time we've updated our llama-index version in Python, and we are doing the required fixes.

We were using the synchronous OpenSearch client, but now it looks like it must be asynchronous, correct?
6 comments
We had this connection_class that handles requests in a synchronous way... but it looks like the new versions of llama-index can only handle asynchronous requests...
[Attachment: image.png]
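For context, here is a rough sketch of the kind of older setup being described. All values are placeholders, and it assumes that extra kwargs such as connection_class were simply forwarded to the synchronous opensearch-py client; the import path may also differ in older llama-index versions.

```python
# Hypothetical old-style setup: connection_class forwarded to the
# synchronous opensearch-py OpenSearch client (all values are placeholders).
from opensearchpy import RequestsHttpConnection
from llama_index.vector_stores.opensearch import OpensearchVectorClient

client = OpensearchVectorClient(
    endpoint="https://localhost:9200",        # placeholder endpoint
    index="my-index",                         # placeholder index name
    dim=1536,                                 # dimension of the embedding model
    http_auth=("admin", "admin"),             # example credentials
    connection_class=RequestsHttpConnection,  # the sync connection class in question
)
```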
I'm not sure where you got that impression. All sync and async methods are supported and implemented in the source code.
The vector client does not have a connection_class kwarg anymore.
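If it helps anyone landing here later, a minimal sketch of how the client is typically constructed with recent llama-index-vector-stores-opensearch versions. Endpoint, index name, credentials, and dimension are placeholders, and it assumes connection-related kwargs like http_auth and use_ssl are forwarded to the underlying opensearch-py clients rather than set via connection_class.

```python
# Minimal sketch for recent llama-index-vector-stores-opensearch versions.
# There is no connection_class kwarg; connection details are passed as
# regular kwargs that (as assumed here) are forwarded to the underlying
# sync and async opensearch-py clients.
from llama_index.vector_stores.opensearch import (
    OpensearchVectorClient,
    OpensearchVectorStore,
)

client = OpensearchVectorClient(
    endpoint="https://localhost:9200",  # placeholder endpoint
    index="my-index",                   # placeholder index name
    dim=1536,                           # must match your embedding model
    embedding_field="embedding",
    text_field="content",
    http_auth=("admin", "admin"),       # example credentials, adjust for your cluster
    use_ssl=True,
    verify_certs=False,
)

vector_store = OpensearchVectorStore(client)
# Both sync and async entry points are available on the store,
# e.g. vector_store.query(...) and await vector_store.aquery(...).
```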
Ooof, looks like llama-index-vector-stores-opensearch needed an upgrade on my side.

Appreciate that!
Ah gotcha, nice!
Should have asked the version you had for that first πŸ˜…