@Logan M can I find out the current context window size, the memory in use, and the tokens consumed at runtime? That way, when usage approaches the limit, I can reset the variables and the chat engine so it doesn't hit the limit and break with an error.
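One approach that might work here is a `TokenCountingHandler` plus a memory reset. A minimal sketch, assuming a recent `llama_index` layout (import paths vary by version); `index`, the model name, and `MAX_TOKENS` are placeholders you would replace with your own:

```python
import tiktoken
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager, TokenCountingHandler
from llama_index.core.memory import ChatMemoryBuffer

MAX_TOKENS = 4096  # context window of the underlying model (assumption)

# count tokens for every LLM call made through the framework
token_counter = TokenCountingHandler(
    tokenizer=tiktoken.encoding_for_model("gpt-3.5-turbo").encode
)
Settings.callback_manager = CallbackManager([token_counter])

# the memory buffer also trims old messages once it exceeds its own limit
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)
chat_engine = index.as_chat_engine(chat_mode="context", memory=memory)

response = chat_engine.chat("What does the report say about revenue?")

# inspect usage after each call and reset before hitting the limit
if token_counter.total_llm_token_count > 0.9 * MAX_TOKENS:
    memory.reset()             # clear the conversation history
    token_counter.reset_counts()
```

Note that `ChatMemoryBuffer` already evicts the oldest messages once `token_limit` is exceeded, so the explicit reset may only be needed if you want a hard cutoff.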
I have added a custom prompt to the chat engine, but it doesn't seem to work. I named the bot Bluebird, yet it still refers to itself as an AI language model. @Logan M
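A common cause of this symptom is the chat mode: in `condense_question` mode the system prompt is not injected into the final LLM call, so a persona prompt appears to be ignored. A minimal sketch using the `context` chat mode, where `system_prompt` is passed through; `index` is assumed to be your already-built index:

```python
chat_engine = index.as_chat_engine(
    chat_mode="context",
    system_prompt=(
        "You are Bluebird, a helpful assistant that answers questions "
        "about the loaded PDFs. Always refer to yourself as Bluebird."
    ),
)
print(chat_engine.chat("Who are you?"))
```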
I am currently building a chatbot that reads data from different PDFs. I use a chat engine to answer queries. The chat engine keeps the conversation context, but when multiple people use the bot at the same time, that context is shared in one memory.
I want every session to have its own memory. Currently, if user 1 asks about X and user 2 asks about Y, then when user 1 asks "tell me more", the bot elaborates on Y instead of X, because the most recent context it holds is about Y. It is not taking each individual user's context into account. Can I please get some help on this? @Logan M
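One pattern that addresses this is keeping one chat engine (and therefore one memory buffer) per session, keyed by a session ID. A minimal sketch, assuming a single shared `index` and a `session_id` supplied by your bot framework (both placeholders):

```python
from llama_index.core.memory import ChatMemoryBuffer

chat_engines = {}  # session_id -> chat engine with its own memory


def get_chat_engine(session_id: str):
    """Return a chat engine whose memory is private to this session."""
    if session_id not in chat_engines:
        memory = ChatMemoryBuffer.from_defaults(token_limit=3000)
        chat_engines[session_id] = index.as_chat_engine(
            chat_mode="context",
            memory=memory,
        )
    return chat_engines[session_id]


# user 1 and user 2 now get independent conversation histories
print(get_chat_engine("user-1").chat("Tell me about X"))
print(get_chat_engine("user-2").chat("Tell me about Y"))
print(get_chat_engine("user-1").chat("Tell me more"))  # follows up on X
```

The index itself stays shared and read-only, so only the lightweight memory objects are duplicated per user; for a long-running deployment you would also want to evict idle sessions from the dict.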