I’m working on memory management with LlamaIndex and have created a Function Calling Agent. I need help choosing the best approach for my use case.
In LlamaIndex, is it possible to store previous questions and answers as memory within an agent?
For example: Question: "What happens if you multiply 2 and 3?" Answer: whatever the agent responds.
Can we store this interaction in the agent's memory before asking another question, so the agent retains the context for future queries?