Regarding: ChatMemoryBuffer

Example:
from llama_index.core.memory import ChatMemoryBuffer  # import path in llama-index >= 0.10

memory = ChatMemoryBuffer.from_defaults(token_limit=1500)

What are the pros/cons of making the token_limit large, even very large?
Thanks!
1 comment
Pros: The buffer retains more of the past conversation, so the LLM has richer context when forming the next query and can answer follow-up questions that depend on earlier turns more accurately.

Cons: Every LLM has a fixed context window (e.g., gpt-3.5-turbo is 4,096 tokens). If you set the token limit very high, the retained memory plus your next query can exceed the model's context window, and the call will fail with a context-length error.

Even if the limit is not technically breached, a large buffer crowds out the response: if 4,000 of 4,096 tokens are consumed by the prompt, the model has only 96 tokens left to generate its answer, which typically yields a truncated or incomplete response.
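
A minimal sketch of budgeting token_limit against the model's context window (assuming llama-index >= 0.10 import paths; the headroom numbers below are illustrative, not prescribed values):

from llama_index.core.memory import ChatMemoryBuffer

CONTEXT_WINDOW = 4096      # e.g., gpt-3.5-turbo (assumed model)
RESPONSE_HEADROOM = 512    # tokens reserved for the model's answer
PROMPT_OVERHEAD = 500      # rough allowance for system prompt + new user query

# Keep the memory small enough that memory + prompt + response fit in the window.
memory = ChatMemoryBuffer.from_defaults(
    token_limit=CONTEXT_WINDOW - RESPONSE_HEADROOM - PROMPT_OVERHEAD,  # 3084
)

With a budget like this, the buffer can never consume so much of the window that the model is left without room to respond.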