
Updated 2 years ago

Hi, I have several TXT files that I want to query

At a glance

The community member has several TXT files they want to query using GPT, and they want the answers to be in long form. The comments suggest that to achieve this, the community member should adjust the max_tokens and num_output settings, either when creating the index or when querying the data. The comments indicate that these settings are part of the "service context object" and should be passed in when loading the data from disk.

Hi, I have several TXT files that I want to query, and I want GPT to answer in long form. My question is: when I'm building the index, should I change the output size so that the answers GPT gives are long form?
4 comments
If your answers aren't already being truncated, then the model is deciding to give short answers. You might also have to tell it to go more in depth (in addition to changing max_tokens and num_output).
Thanks! Changing max_tokens and num_output, is this done when creating the index?
Or is it done in the query?
When creating the index; all of these settings go into different parts of the service context object.

And when you load from disk, make sure to pass the service context object back in too 👍
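The advice above can be sketched roughly as follows, assuming the legacy LlamaIndex (pre-0.10) API this thread is about; the directory names (`data`, `./storage`) and the query string are placeholders, and running this requires an OpenAI API key.

```python
# A minimal sketch, assuming the legacy LlamaIndex (pre-0.10) API.
from langchain.llms import OpenAI
from llama_index import (
    GPTVectorStoreIndex,
    LLMPredictor,
    ServiceContext,
    SimpleDirectoryReader,
    StorageContext,
    load_index_from_storage,
)

# max_tokens caps how long each completion may be; num_output tells the
# prompt helper to reserve that much room in the context window.
llm_predictor = LLMPredictor(
    llm=OpenAI(temperature=0, max_tokens=1024)
)
service_context = ServiceContext.from_defaults(
    llm_predictor=llm_predictor, num_output=1024
)

# Build the index over the TXT files with the long-form settings,
# then persist it to disk.
documents = SimpleDirectoryReader("data").load_data()
index = GPTVectorStoreIndex.from_documents(
    documents, service_context=service_context
)
index.storage_context.persist(persist_dir="./storage")

# When loading from disk, pass the same service context back in.
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(
    storage_context, service_context=service_context
)
response = index.as_query_engine().query("Summarize the files in depth.")
print(response)
```

Passing the service context at both build time and load time keeps the output settings consistent across sessions, which is the point of the last comment above.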