Logan M when we create a chat engine

When we create a chat engine, how is the chat history persisted? Just as we have so many DB integrations available for the vector store, what about the conversation history? How do we handle building a chatbot for a large organisation on a dataset and deploying it on Azure?
We are working on adding some better features for persisting chat history.

In the latest version of llama-index, you can do:

Python
# Import path may vary by version (newer releases: llama_index.core.memory)
from llama_index.memory import ChatMemoryBuffer

json_str = chat_engine._memory.to_json()
memory = ChatMemoryBuffer.from_json(json_str)


There are also to_dict / from_dict methods.
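
For example, one way to persist this across sessions is to write the serialized memory to disk and load it back when rebuilding the engine. This is just a minimal sketch building on the snippet above; the file path, the existing index and chat_engine objects, and the memory kwarg on as_chat_engine are assumptions that may vary with your setup and llama-index version.

Python
# Minimal sketch: round-trip the chat history through a local JSON file.
# "index" and "chat_engine" are assumed to exist already; the path is arbitrary.
import os

from llama_index.memory import ChatMemoryBuffer  # newer versions: llama_index.core.memory

MEMORY_PATH = "chat_memory.json"

# After a chat session, dump the memory to disk
with open(MEMORY_PATH, "w") as f:
    f.write(chat_engine._memory.to_json())

# On the next session, restore the memory and hand it to a fresh chat engine
if os.path.exists(MEMORY_PATH):
    with open(MEMORY_PATH) as f:
        memory = ChatMemoryBuffer.from_json(f.read())
    chat_engine = index.as_chat_engine(memory=memory)

The same pattern should work with to_dict / from_dict if you would rather store the history in a database alongside your vector store.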