Define a custom PromptHelper and set max_chunk_overlap=0 (you can see an example here: https://gpt-index.readthedocs.io/en/latest/how_to/custom_llms.html)
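A minimal sketch of that setup, based on the legacy gpt-index `PromptHelper` constructor shown in the linked docs; the argument names (`max_input_size`, `num_output`, `max_chunk_overlap`) and the example values here are from that older API and may differ in newer releases:

```python
# Sketch only: assumes the legacy gpt-index / llama-index PromptHelper API
# from the linked docs; check your installed version for the exact signature.
from llama_index import PromptHelper

prompt_helper = PromptHelper(
    max_input_size=4096,   # context window of the LLM (e.g. 4096 tokens)
    num_output=256,        # tokens reserved for the model's response
    max_chunk_overlap=0,   # 0 disables overlap between chunks entirely
)
```

The resulting `prompt_helper` is then passed to the index (or service context, depending on version) as shown in the linked how-to page.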
Are those also the defaults? Or how do I see the defaults for each LLM?
By default, the chunk overlap is 1/10 of the max_input_size, capped at 200.
So by default it's not related to chunk_size_limit?
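The default rule described above can be written out as a small pure-Python illustration (the function name `default_chunk_overlap` is mine, not the library's; the real computation lives inside the gpt-index internals):

```python
# Illustration of the stated default (not library code): the overlap
# defaults to 1/10 of max_input_size, capped at 200 tokens, and is
# independent of chunk_size_limit.
def default_chunk_overlap(max_input_size: int) -> int:
    return min(max_input_size // 10, 200)

print(default_chunk_overlap(4096))  # 4096 // 10 = 409, so capped at 200
print(default_chunk_overlap(1024))  # 1024 // 10 = 102, below the cap
```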