Error context length
leeo
2 years ago
I keep getting this error: error_message="This model's maximum context length is 4097 tokens, however you requested 4146 tokens (3890 in your prompt; 256 for the completion)."
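The numbers in the error message explain the failure on their own: the prompt plus the reserved completion exceeds the model's context window. A quick check with the figures from the message:

```python
# Token budget from the error message above.
MAX_CONTEXT = 4097   # model's maximum context length (tokens)
PROMPT = 3890        # tokens already used by the prompt
COMPLETION = 256     # tokens reserved for the completion

requested = PROMPT + COMPLETION      # total the request asks for
overflow = requested - MAX_CONTEXT   # how far over budget it is
print(requested, overflow)           # 4146 requested, 49 tokens over
```

So the request is 49 tokens over budget; shrinking the prompt (e.g. with smaller chunks) or the completion limit by at least that much resolves it.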
Logan M
2 years ago
Try decreasing the chunk_size_limit in the service context object:
service_context = ServiceContext.from_defaults(..., chunk_size_limit=3000)
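The idea behind the fix is simply to cap each retrieved chunk so that prompt overhead plus the completion still fits the model's window. A minimal stdlib-only sketch of that budget calculation (the 800-token prompt overhead is a hypothetical figure for illustration, not from the thread; a real setup would count tokens with the model's tokenizer):

```python
def max_chunk_tokens(max_context: int, overhead: int, completion: int) -> int:
    """Largest chunk size that still leaves room for the fixed prompt
    overhead (instructions, question) plus the reserved completion tokens."""
    budget = max_context - overhead - completion
    if budget <= 0:
        raise ValueError("no room left for retrieved text")
    return budget

# With the thread's numbers: a 4097-token window, 256 tokens reserved for
# the completion, and an assumed ~800 tokens of fixed prompt overhead:
print(max_chunk_tokens(4097, 800, 256))  # 3041
```

Any chunk_size_limit at or below that budget keeps requests inside the window, which is why lowering it to 3000 works here.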
leeo
2 years ago
thanks