Hello all, I am having an issue with my chatbot. This is the third one I have built, and with this one I am hitting a recursion error. Prompting something simple like "Hi" works fine, but any prompt specific enough that answering it requires diving into the storage triggers the error. I could not find anything related on GitHub/StackOverflow.
The thing is, this chatbot's storage is much larger than my previous chatbot's. Could that be the issue? I couldn't find anything about the recommended size of a vector store.
I have resolved the previous issue, but I'm facing a challenge with my current chatbot. I have too many indexes, and every prompt gives this error:
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 50940 tokens. Please reduce the length of the messages.
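For anyone hitting the same error: one common workaround is to cap how much retrieved text gets stuffed into the prompt before calling the model. Below is a minimal, hedged sketch of that idea. The function name `fit_chunks_to_budget` and the 4-characters-per-token ratio are my own illustrative assumptions, not anything from this post or a specific library; an actual tokenizer for the model in use (e.g. tiktoken for OpenAI models) would give exact counts.

```python
# Hedged sketch: greedily keep only as many retrieved chunks as fit in a
# token budget, so the final prompt stays under the model's context limit.
# ASSUMPTION: ~4 characters per token is a rough heuristic, not an exact
# count; use the model's real tokenizer for accurate numbers.

def fit_chunks_to_budget(chunks, max_tokens=3000, chars_per_token=4):
    """Add chunks in order until the approximate token budget would be exceeded."""
    budget_chars = max_tokens * chars_per_token
    kept, used = [], 0
    for chunk in chunks:
        if used + len(chunk) > budget_chars:
            break  # dropping the rest keeps the prompt within the limit
        kept.append(chunk)
        used += len(chunk)
    return kept

# Three hypothetical retrieved chunks of 5000 characters each (~1250 tokens each)
docs = ["a" * 5000, "b" * 5000, "c" * 5000]
kept = fit_chunks_to_budget(docs, max_tokens=3000)
print(len(kept))  # → 2: only the first two chunks fit under ~3000 tokens
```

This trades answer completeness for staying inside the context window; retrieval settings (e.g. returning fewer top matches per query) are another place to apply the same idea.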