has anyone seen "cross talk" for multi user streaming responses?

has anyone seen "cross talk" for multi user streaming response? i'm seeing when 2 simulatneous streaming chat calls cause the response streams to cross talk.
What llama-index version do you have?
we fixed this recently
try updating to the latest, I think the fix was after that version
there was a shared queue lol whoops
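For anyone curious what that looks like in practice, here is a minimal sketch (illustrative only, not the actual llama-index internals) of why one module-level queue causes cross talk: two concurrent streaming calls push tokens into the same queue, so each consumer drains a mix of both responses. The `produce`/`consume` names and token strings are made up for the example.

```python
# Illustrative sketch of the "shared queue" failure mode (hypothetical names,
# not the actual llama-index code). Two concurrent streaming calls push tokens
# into one module-level queue, so each consumer drains a mix of both responses.
import asyncio

_shared_queue: asyncio.Queue = asyncio.Queue()  # BUG: shared by every request


async def produce(tokens):
    for tok in tokens:
        await _shared_queue.put(tok)
        await asyncio.sleep(0)  # yield so the other request's producer can run


async def consume(n):
    # A consumer has no way to tell which request a token came from.
    return [await _shared_queue.get() for _ in range(n)]


async def main():
    results = await asyncio.gather(
        produce(["A1", "A2", "A3"]),
        produce(["B1", "B2", "B3"]),
        consume(3),
        consume(3),
    )
    # Typically prints interleaved tokens, e.g. ['A1', 'B1', 'A2'] ['B2', 'A3', 'B3'].
    # The fix is a fresh queue created per streaming call, so tokens can't leak
    # between responses.
    print(results[2], results[3])


asyncio.run(main())
```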
yeah figured it was using the same http connection
thought i'd check before i jump in to dig through and fix it
that did it. thanks!
while i have you looking: i always have to reinstall aim because of a sqlalchemy 2.0 conflict. i'm not even using aim anywhere, but it's still trying to initialize
is there a workaround so i don't have to reinstall it every time i upgrade llama-index?
Hmm, maybe move aim to be after llama-index in your requirements.txt?
yeah that's what i'm doing and it works. just strange since i don't initialize aim explicitly; it must be initializing somewhere. i may take a look to see if it can be removed and only initialized as needed
surprised nobody else has had that problem
it's the __init__.py in the callbacks module that explicitly imports aim
I've never noticed the issue before πŸ€”
If you aren't using aim, then I would just not install it?

The import will fail gracefully at least
[Attachment: image.png]
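A minimal sketch of the optional-import pattern being described (the names are illustrative, not the exact contents of llama-index's callbacks __init__.py): if aim isn't installed, the ImportError is swallowed and only code that explicitly asks for the Aim handler fails, with a clear message.

```python
# Illustrative optional-import pattern (hypothetical names, not the
# actual llama-index callbacks/__init__.py).
try:
    import aim  # heavy optional dependency; may conflict with sqlalchemy 2.0
    AIM_AVAILABLE = True
except ImportError:
    aim = None
    AIM_AVAILABLE = False


def get_aim_handler():
    """Return an Aim-backed callback handler, or explain how to enable it."""
    if not AIM_AVAILABLE:
        raise ImportError(
            "aim is not installed; run `pip install aim` to use this handler"
        )
    return aim.Run()  # placeholder for whatever the real handler constructs
```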
yeah that's right, just uninstalling it works
i had it running for a while but then decided to stop using it, and never uninstalled it
anyway i hope they upgrade to sqlalchemy 2.0 soon
thanks @Logan M