leeo
Joined September 25, 2024
I keep getting this error: error_message="This model's maximum context length is 4097 tokens, however you requested 4146 tokens (3890 in your prompt; 256 for the completion)."
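The error is arithmetic: prompt tokens plus the requested completion tokens must not exceed the model's context window (4097 here), and 3890 + 256 = 4146 is 49 over. A minimal sketch of how to cap the completion budget accordingly (the `max_completion_tokens` helper is hypothetical, not part of any SDK):

```python
def max_completion_tokens(prompt_tokens: int, context_limit: int = 4097) -> int:
    """Largest completion budget that still fits in the context window."""
    # Whatever the prompt doesn't use is available for the completion;
    # never go below zero if the prompt alone already overflows.
    return max(context_limit - prompt_tokens, 0)

# With a 3890-token prompt, only 207 tokens remain for the completion,
# so requesting 256 triggers the error above.
print(max_completion_tokens(3890))  # 207
```

Passing this value as `max_tokens` (or shortening the prompt) should make the request fit; counting prompt tokens ahead of time usually requires a tokenizer such as tiktoken.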