

Is there a built-in way to handle conversation history?

Is there a built-in way to handle conversation history? I have kind of built my own thing, but I guess that's not how it is supposed to be done 🙂 That aside, is there a way to count tokens so that the history does not grow too big?
Yeah, there is the ChatEngine, which is a stateful version (it keeps conversation history): https://docs.llamaindex.ai/en/stable/module_guides/deploying/chat_engines/root.html

You can use ChatMemoryBuffer to control the size of the chat history: https://docs.llamaindex.ai/en/stable/api_reference/memory.html#llama_index.core.memory.ChatMemoryBuffer
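To make the token-limiting behavior concrete, here is a minimal, self-contained sketch of the idea behind a token-bounded chat history. This is not the library's actual implementation: `TokenLimitedHistory` is a hypothetical class, and the whitespace-split token count is a crude stand-in for a real tokenizer. In LlamaIndex itself you would just pass `ChatMemoryBuffer.from_defaults(token_limit=...)` to your chat engine.

```python
class TokenLimitedHistory:
    """Sketch of a token-bounded chat history (illustrative only)."""

    def __init__(self, token_limit=1500):
        self.token_limit = token_limit
        self.messages = []  # list of (role, content) tuples

    @staticmethod
    def _count_tokens(text):
        # Crude stand-in for a real tokenizer: count whitespace-separated words.
        return len(text.split())

    def put(self, role, content):
        # Append every message; trimming happens on read.
        self.messages.append((role, content))

    def get(self):
        # Walk backwards from the newest message, keeping messages until
        # the token budget is exhausted, so the oldest turns drop off first.
        total = 0
        kept = []
        for role, content in reversed(self.messages):
            n = self._count_tokens(content)
            if total + n > self.token_limit:
                break
            total += n
            kept.append((role, content))
        return list(reversed(kept))


# Usage: with a 5-token budget, only the most recent turn fits.
history = TokenLimitedHistory(token_limit=5)
history.put("user", "one two three")
history.put("assistant", "four five six")
print(history.get())  # only the latest message survives the budget
```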
Thank you very much 🙂