It seems like llama_index shares the chat memory across the different chat engines created in my application. How can I use my index to build a chatbot application that can interact with multiple users at the same time?
You can also optionally manage the history for each user yourself. Something like chat_engine.chat_history to get the current chat history, then pass it back in with .chat(..., chat_history=chat_history)
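A rough sketch of that per-user history approach (hedged: exact import paths and signatures vary across llama_index versions, and the directory name and token handling here are just placeholders):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.llms import ChatMessage

# One index and one engine shared by the whole app
index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("data").load_data()
)
chat_engine = index.as_chat_engine(chat_mode="context")

# user_id -> that user's chat history
histories: dict[str, list[ChatMessage]] = {}

def chat_as(user_id: str, message: str) -> str:
    history = histories.get(user_id, [])
    # Pass this user's history explicitly so the engine's internal memory
    # doesn't mix conversations from different users
    response = chat_engine.chat(message, chat_history=history)
    # Read the updated history back and store it per user
    histories[user_id] = chat_engine.chat_history
    return str(response)
```

Note this assumes requests are handled one at a time; for truly concurrent users you'd likely want a separate engine (or at least a lock) per user instead of one shared engine.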
Hmm, I pass my own memory because I want to limit it to fewer characters. Could that be the problem? Perhaps I need to create a different memory for each engine as well
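If that's the case, a minimal sketch of the likely fix is to create a fresh, size-limited memory buffer per engine rather than reusing one memory object everywhere (hedged: the token_limit value is an arbitrary example, and ChatMemoryBuffer limits by tokens rather than characters):

```python
from llama_index.core.memory import ChatMemoryBuffer

def make_engine_for_user(index):
    # New buffer each call, so each user's engine gets its own memory
    memory = ChatMemoryBuffer.from_defaults(token_limit=1000)
    return index.as_chat_engine(chat_mode="context", memory=memory)

# Reusing a single shared memory instance in every as_chat_engine() call is
# what makes conversations bleed across users.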