
Updated last year

@Logan M

How do I increase the context length? I'm getting this error:

[1] C:\Users\MSI\test_chatbot\backend\node_modules\openai\error.js:43
[1] return new BadRequestError(status, error, message, headers);
[1] ^
[1]
[1] BadRequestError: 400 This model's maximum context length is 4097 tokens. However, your messages resulted in 4153 tokens. Please reduce the length of the messages.
3 comments
You can't increase past 4097 for that model; some other issue is causing this input error

Probably depends on other settings and what you are doing
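A common way to avoid this error is to trim the oldest chat turns until the prompt fits under the model's token limit. Below is a minimal sketch of that idea; the `count_tokens` approximation (~4 characters per token) and the `trim_history` helper are illustrative, not part of create-llama — for exact counts you would use OpenAI's tiktoken library.

```python
def count_tokens(text: str) -> int:
    # Rough approximation: ~4 characters per token for English text.
    # Swap in tiktoken for exact counts against a specific model.
    return max(1, len(text) // 4)

def trim_history(messages, max_tokens=4097, reserve=256):
    """Drop the oldest non-system messages until the prompt fits.

    `reserve` leaves headroom in the budget for the model's reply.
    """
    budget = max_tokens - reserve
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and sum(count_tokens(m["content"]) for m in system + rest) > budget:
        rest.pop(0)  # discard the oldest turn first
    return system + rest
```

Dropping oldest-first keeps the system prompt and the most recent turns, which is usually what matters for a chatbot; a fancier approach would summarize the dropped turns instead.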
I made this chatbot using create-llama.
Is there a blog or anything out there that explains the codebase, to help understand it better?