Gpt4all
AvishWagde · last year
I'm getting the response "ERROR: The prompt size exceeds the context window size and cannot be processed." while querying a document with an LLM QA bot. How do I solve this?
https://stackoverflow.com/questions/76873456/error-the-prompt-size-exceeds-the-context-window-size-and-cannot-be-processed
3 comments
Logan M
last year
Isn't the max input size for gpt4all 2048? But you are setting it to 4096?
Logan M
last year
You probably also want to lower the chunk size in that case, maybe to 512
AvishWagde
last year
Thank you very much. Yes, the max size is 2048, and with that a chunk size of 512 works; a chunk size of 1024 doesn't.
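Putting the two suggestions together (declare the model's real 2048-token context window instead of 4096, and drop the chunk size to 512), a minimal sketch is shown below. It assumes the legacy llama_index ServiceContext/PromptHelper API together with LangChain's GPT4All wrapper; the model path, data directory, and query string are placeholders rather than the poster's actual code.

```python
# A minimal sketch of the fix discussed in this thread, assuming the legacy
# llama_index ServiceContext/PromptHelper API and LangChain's GPT4All wrapper.
# The model path and data directory are placeholders.
from langchain.llms import GPT4All
from llama_index import (
    PromptHelper,
    ServiceContext,
    SimpleDirectoryReader,
    VectorStoreIndex,
)

# Placeholder path to a local GPT4All model file.
llm = GPT4All(model="./models/ggml-gpt4all-j-v1.3-groovy.bin")

# Declare the model's real context window (2048, not 4096) and reserve
# some of it for the generated answer.
prompt_helper = PromptHelper(
    context_window=2048,
    num_output=256,
    chunk_overlap_ratio=0.1,
)

service_context = ServiceContext.from_defaults(
    llm=llm,
    prompt_helper=prompt_helper,
    chunk_size=512,       # smaller chunks so retrieved context fits in 2048 tokens
    embed_model="local",  # local HuggingFace embeddings (needs sentence-transformers)
)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()
print(query_engine.query("What does the document say?"))
```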