Chat engine

It seems like llama_index shares the chat memory across the different chat engines created in my application. How can I use my index to build a chatbot application that interacts with multiple users at the same time?
Just create a chat engine per user
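A minimal sketch of that setup, assuming the llama_index.core import layout, a local "data" directory, and a default LLM configured in the environment; get_chat_engine and the chat_engines dict are hypothetical names used here for illustration:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Build the index once and share it; only the chat engines are per-user.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

chat_engines = {}

def get_chat_engine(user_id: str):
    # Lazily create one context chat engine per user so conversations stay separate.
    if user_id not in chat_engines:
        chat_engines[user_id] = index.as_chat_engine(chat_mode="context")
    return chat_engines[user_id]
```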

You can also optionally manage the history for each user. Something like chat_engine.chat_history to get the current chat history, and pass it in with .chat(..., chat_history=chat_history)
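A sketch of managing the history yourself, reusing the hypothetical get_chat_engine helper above; histories and chat_turn are illustrative names:

```python
from llama_index.core.llms import ChatMessage

# Per-user histories kept outside the engine so you can persist or trim them yourself.
histories: dict[str, list[ChatMessage]] = {}

def chat_turn(user_id: str, message: str) -> str:
    engine = get_chat_engine(user_id)
    # Pass the stored history in, then read the updated history back off the engine.
    response = engine.chat(message, chat_history=histories.get(user_id, []))
    histories[user_id] = engine.chat_history
    return str(response)
```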
I am creating a chat_engine for each user, which is why I'm confused. But I'll try passing my own chat history
I am using the context chat engine
Yea, creating a chat engine for each user should give each one its own independent memory. Very weird
Hmm, I pass in my own memory because I want to limit it to fewer characters. Could that be the problem? Perhaps I need to create a separate memory for each engine as well
Yes, that was the problem. When I create a separate ChatMemory for each chat engine, it works correctly
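For reference, a sketch of the working setup: each engine gets its own freshly created ChatMemoryBuffer instead of one memory object shared between engines. The token_limit value and user ids are arbitrary, and index is the shared index from the earlier sketch:

```python
from llama_index.core.memory import ChatMemoryBuffer

def make_chat_engine(index):
    # A separate memory buffer per engine; reusing one buffer across engines
    # is what made the chat history look shared.
    memory = ChatMemoryBuffer.from_defaults(token_limit=1500)
    return index.as_chat_engine(chat_mode="context", memory=memory)

chat_engines = {user_id: make_chat_engine(index) for user_id in ("alice", "bob")}
```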