Error context length

I keep getting this error: error_message="This model's maximum context length is 4097 tokens, however you requested 4146 tokens (3890 in your prompt; 256 for the completion)"
Try decreasing the chunk_size_limit in the service context object:

service_context = ServiceContext.from_defaults(..., chunk_size_limit=3000)
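The numbers in the error message show why a smaller chunk size helps. A minimal sketch of the token budget, using only the figures from the error above (the 49-token overflow and 3841-token prompt budget are derived, not from the source):

```python
# Token budget behind the error, using the numbers from the message above.
MAX_CONTEXT = 4097   # model's maximum context length
COMPLETION = 256     # tokens reserved for the completion
PROMPT_USED = 3890   # tokens the prompt actually consumed

# How far the request exceeded the limit.
overflow = PROMPT_USED + COMPLETION - MAX_CONTEXT
print(overflow)  # 49 tokens over (4146 - 4097)

# The prompt budget that would have fit alongside the completion.
prompt_budget = MAX_CONTEXT - COMPLETION
print(prompt_budget)  # 3841

# A chunk_size_limit of 3000 keeps each retrieved chunk well under that
# budget, leaving headroom for the query and the prompt template.
```

So any chunk_size_limit comfortably below 3841 (minus the prompt template's own tokens) should avoid this error for a 4097-token model.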