Multiple responses

Can llama_index's chat_engine handle multiple requests at the same time?
I have found that if I make two requests at the exact same time, the responses get messy: it looks like pieces of text from the different requests are mixed together.

If not, what would be the best way to address this?

Thanks for your time.
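
For reference, a minimal sketch of the kind of simultaneous-request scenario described above (not the original poster's code): it assumes the llama_index 0.7.x API (SimpleDirectoryReader, VectorStoreIndex, as_chat_engine), and the "data" directory and questions are placeholders. Two requests are fired at the same chat engine from separate threads.

```python
# Minimal concurrency sketch: two requests against one chat engine at once.
from concurrent.futures import ThreadPoolExecutor

from llama_index import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()
chat_engine = VectorStoreIndex.from_documents(documents).as_chat_engine()

questions = [
    "Summarize the documents in one sentence.",
    "List the main topics covered.",
]

# Fire both requests simultaneously; note that they share one engine
# (and therefore one chat history).
with ThreadPoolExecutor(max_workers=2) as pool:
    responses = list(pool.map(chat_engine.chat, questions))

for question, response in zip(questions, responses):
    print(f"{question}\n  -> {response}\n")
```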
6 comments
It should, although only in the very latest version, 0.7.22.

Just fixed a bug for this over the weekend.
OK, I'm using version 0.7.17. I will update it and keep you informed.
Thanks, Logan
Version 0.7.22 solved the issue. Tested with 10+ requests at the same time.
@Logan M u Rock!
Awesome! πŸ‘
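
For completeness, a minimal sketch of the kind of 10+ concurrent-request check mentioned above (not from the thread): it assumes llama-index >= 0.7.22, that the chat engine exposes the async achat method, and the data directory and questions are placeholders.

```python
# Async version of the same check: 12 concurrent requests to one chat engine.
import asyncio

from llama_index import SimpleDirectoryReader, VectorStoreIndex


async def main() -> None:
    documents = SimpleDirectoryReader("data").load_data()
    chat_engine = VectorStoreIndex.from_documents(documents).as_chat_engine()

    # Send 12 requests at once and check that each response comes back whole,
    # not interleaved with text from the other requests.
    questions = [f"Answer with the number {i} spelled out." for i in range(12)]
    responses = await asyncio.gather(*(chat_engine.achat(q) for q in questions))

    for question, response in zip(questions, responses):
        print(f"{question}\n  -> {response}\n")


if __name__ == "__main__":
    asyncio.run(main())
```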