The community member's post asks how "chat history" is persisted when creating a chat engine, noting that while many database integrations exist for vector stores, it is unclear what the options are for conversation history. They also ask how to handle building and deploying a chatbot for a large organisation on a dataset, specifically on Azure.
In the comments, another community member responds that better features for persisting chat history are in the works. They note that in the latest version of llama-index you can serialize the memory with json_str = chat_engine._memory.to_json() and restore it with memory = ChatMemoryBuffer.from_json(json_str); equivalent to_dict/from_dict options are also available.
When we create a chat engine, how is the "chat history" persisted? Just as we have so many DB integrations available for vector stores, what about the conversation history? How do we handle building a chatbot for a large organisation on a dataset and deploying it on Azure?