Can anyone please help me out here? I want to save the memory object to a file so that I can load it later, from a setup like the following sample code:
Plain Text
# import paths assume llama-index >= 0.10
from llama_index.core.llms import ChatMessage
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo", api_key=api_key)
memory = ChatMemoryBuffer.from_defaults(token_limit=2000)
index = create_index(text1, text2)
chat_engine = index.as_chat_engine(
    llm=llm,
    chat_mode="context",
    memory=memory,
    prefix_messages=[
        ChatMessage(role="system", content="..."),
        ChatMessage(role="assistant", content="Hello, and Welcome, please introduce yourself"),
    ],
)
for i in range(5):
    resp = chat_engine.chat("Hello my name is John").response
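Concretely, I'd like a round trip like this (save_memory and load_memory are made-up placeholders; implementing them is exactly what I'm asking about):

Plain Text
# hypothetical placeholders, not a real API -- this is the part I don't know how to do
save_memory(memory, "memory.json")   # before the server goes down
memory = load_memory("memory.json")  # on restart, resume with the same buffer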
4 comments
You can get all the conversation that has happened till now with:

Plain Text
import json

# get the conversation that has happened till now (a list of ChatMessage objects)
chat_history = memory.get()

# convert the whole memory buffer into a dict format
chat_dict = memory.dict()

# now save this dict into a JSON file
with open("chat_memory.json", "w") as f:
    json.dump(chat_dict, f)

# once you reload your server, pass the saved conversation in when a chat happens
response = chat_engine.chat(message=message, chat_history=chat_history)
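For the reload side, here is a rough sketch that saves the messages themselves rather than the whole buffer dict, since the layout of memory.dict() can differ between LlamaIndex versions. The chat_history.json filename and the ChatMessage(**m) pydantic round trip are assumptions, so double-check against your version:

Plain Text
import json
from llama_index.core.llms import ChatMessage

# before shutdown: save the messages as plain dicts
with open("chat_history.json", "w") as f:
    json.dump([m.dict() for m in memory.get()], f)

# after reload: turn the saved dicts back into ChatMessage objects
with open("chat_history.json") as f:
    chat_history = [ChatMessage(**m) for m in json.load(f)]

response = chat_engine.chat(message=message, chat_history=chat_history)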
But in the case of long conversations, couldn't using chat_history lead to token-limit-exceeded errors? Memory objects have a fixed max token limit.
True. I would suggest you also store every conversation yourself, in the form:

user -
bot -

and save it in a JSON file. This way you keep a complete record of the whole conversation on your side.
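A rough sketch of that record format (the chat_log.json filename is just an illustration):

Plain Text
import json

# append each exchange as a {user, bot} record
conversation_log = []
conversation_log.append({
    "user": "Hello my name is John",
    "bot": resp,
})
with open("chat_log.json", "w") as f:
    json.dump(conversation_log, f, indent=2)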
Yeah, I am storing that separately, but AFAIK ChatMemoryBuffer has a summarizer mechanism that keeps the entire chat history within the max token limit. That's why I wanted to be able to save the memory object as it is.
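If you want the buffer saved wholesale, recent LlamaIndex releases expose serialization helpers on the memory class itself. A minimal sketch, assuming your version has ChatMemoryBuffer.to_string() and from_string() (worth checking against your installed release):

Plain Text
from llama_index.core.memory import ChatMemoryBuffer

# persist the memory object itself, settings and history included
with open("memory.json", "w") as f:
    f.write(memory.to_string())

# later: restore the buffer and hand it back to a new chat engine
with open("memory.json") as f:
    memory = ChatMemoryBuffer.from_string(f.read())
chat_engine = index.as_chat_engine(llm=llm, chat_mode="context", memory=memory)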