A community member asked whether conversational memory can be added to LlamaIndex. Another community member confirmed that it can: the Chat Engine can be used in place of the query engine to enable conversational memory, and they linked the relevant documentation. The original poster thanked them and said they would look into it.
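As a rough illustration of the suggestion, here is a minimal sketch of swapping a query engine for a chat engine in LlamaIndex. It assumes a recent `llama_index.core` package layout, a `data/` directory of documents, and a configured LLM backend (e.g. an API key in the environment); import paths and defaults vary across LlamaIndex versions, so treat this as a starting point rather than the exact code from the linked docs.

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.memory import ChatMemoryBuffer

# Build an index over local documents (assumes a ./data directory exists).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# A plain query engine answers each question independently, with no memory:
# query_engine = index.as_query_engine()
# query_engine.query("What does the document say about X?")

# A chat engine keeps conversation history in a memory buffer, so
# follow-up questions can refer back to earlier turns.
memory = ChatMemoryBuffer.from_defaults(token_limit=1500)
chat_engine = index.as_chat_engine(
    chat_mode="condense_question",  # rewrites follow-ups into standalone queries
    memory=memory,
)

response = chat_engine.chat("What does the document say about X?")
follow_up = chat_engine.chat("Can you elaborate on that point?")
```

The `condense_question` mode condenses the chat history plus the new message into a standalone question before querying the index, which is one of several chat modes LlamaIndex offers for adding memory on top of an existing index.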