LangChain offers the ability to save the chat context to a file and then load it back into the chain via `load_memory_variables`. Is there any alternative to that in llama_index? I'm aware of the ReAct mode, but since I'm running inside an AWS Lambda, I would have to load the chat variables from a file between invocations.
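For context, this is roughly the pattern I'm trying to replicate, sketched here with plain JSON and the standard library only (the file path and message shape are placeholders for illustration, not the actual LangChain or llama_index API):

```python
import json
from pathlib import Path

# Hypothetical location -- in a Lambda this would be /tmp or, more
# durably, an object in S3, since /tmp is wiped between cold starts.
HISTORY_PATH = Path("/tmp/chat_history.json")

def save_history(messages: list) -> None:
    """Persist the chat messages (role/content dicts) to disk."""
    HISTORY_PATH.write_text(json.dumps(messages))

def load_history() -> list:
    """Reload the previous conversation, or start fresh if none exists."""
    if HISTORY_PATH.exists():
        return json.loads(HISTORY_PATH.read_text())
    return []

# Round trip: save on the way out of one invocation, load on the next.
save_history([{"role": "user", "content": "hi"}])
restored = load_history()
print(restored)
```

Essentially I want the equivalent of this round trip, but with llama_index's chat memory objects instead of raw dicts.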