
Is there a built-in way to handle conversation history?

At a glance

A community member asks whether there is a built-in way to handle conversation history and to control the size of the chat history. Another community member replies that the ChatEngine is a stateful version that keeps conversation history, and that ChatMemoryBuffer can be used to cap the size of the chat history. The original poster thanks them for the information.

Is there a built-in way to handle conversation history? I have kind of built my own thing, but I guess that's not how it is supposed to be done 🙂 Apart from that, is there a way to count tokens so that the history will not grow too big?
2 comments
Yeah, there is the ChatEngine, which is a stateful version (it keeps conversation history): https://docs.llamaindex.ai/en/stable/module_guides/deploying/chat_engines/root.html

You can use ChatMemoryBuffer to control the size of the chat history: https://docs.llamaindex.ai/en/stable/api_reference/memory.html#llama_index.core.memory.ChatMemoryBuffer
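For reference, here's a minimal sketch of how the two can fit together, assuming llama_index.core is installed, an LLM API key is already configured, and a folder named data (a placeholder) holds the documents to index:

```python
# Sketch: a stateful chat engine whose history is capped by a token-limited
# ChatMemoryBuffer, so the conversation does not grow without bound.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.memory import ChatMemoryBuffer

# Build an index over local documents ("data" is a placeholder path).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Cap the chat history at roughly 1500 tokens; older messages get truncated.
memory = ChatMemoryBuffer.from_defaults(token_limit=1500)

# Stateful chat engine: it stores the conversation history in `memory`.
chat_engine = index.as_chat_engine(chat_mode="context", memory=memory)

print(chat_engine.chat("What do these documents cover?"))
print(chat_engine.chat("Can you expand on the first point?"))  # uses history
```

The token_limit value of 1500 is just an illustrative number; tune it to your model's context window.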
thank you very much πŸ™‚