
Updated 5 months ago

Multiple responses

At a glance

A community member reports that the llama_index chat_engine returns garbled responses when multiple requests are made at the same time, with text from different requests mixed together. Another community member notes that this was fixed in version 0.7.22. The original poster, on version 0.7.17, updates and confirms that 0.7.22 resolves the issue, testing with more than 10 simultaneous requests.

Can the llama_index chat_engine handle multiple requests at the same time?
I have found that if I make two requests at exactly the same time, the responses get messy. It seems to return pieces of text from the different requests.

If not, what would be the best way to address this?

Thanks for your time
6 comments
It should, although only in the very latest version 0.7.22

Just fixed a bug for this on the weekend
Ok, I'm using version 0.7.17. I will update it and keep you informed.
Thanks Logan
Version 0.7.22 solved the issue. Tested with 10+ requests at the same time
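A concurrency check like the one described ("10+ requests at the same time") can be sketched as below. `fake_chat` is a hypothetical stand-in for a real `chat_engine.chat()` call (in llama_index you would build the engine with `index.as_chat_engine()`); using a stub keeps the harness runnable anywhere. The idea is the same: fire many requests in parallel and verify each response belongs to its own prompt, with no text mixed in from other requests.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for chat_engine.chat(); swap in the real
# engine's chat method to test against llama_index >= 0.7.22.
def fake_chat(prompt: str) -> str:
    return f"answer to: {prompt}"

def stress_test(chat_fn, n_requests: int = 10) -> bool:
    """Send n_requests prompts concurrently and check that every
    response matches its own prompt (i.e., no cross-request mixing)."""
    prompts = [f"question {i}" for i in range(n_requests)]
    with ThreadPoolExecutor(max_workers=n_requests) as pool:
        # pool.map preserves input order, so responses[i] pairs with prompts[i]
        responses = list(pool.map(chat_fn, prompts))
    return all(r == f"answer to: {p}" for p, r in zip(prompts, responses))
```

With a real chat engine you would instead assert something weaker, e.g. that each response mentions content unique to its own prompt, since exact outputs are not deterministic.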
Awesome! πŸ‘