Cut off

Hey there, I'm playing with Llama and the GPTSimpleVectorIndex. So far, I've been able to create a custom index and query it. But when I query the index, it looks like the answer gets cut off. I think it may have something to do with the max_tokens value. Should I use LLMPredictor to set a bigger max_tokens value?

It would be great if you could point me in the right direction here.
1 comment
Yeah, definitely. The default max_tokens from OpenAI is 256.

You can increase this like so:
https://gpt-index.readthedocs.io/en/latest/how_to/customization/custom_llms.html#example-changing-the-number-of-output-tokens-for-openai-cohere-ai21
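
Something like this (a minimal sketch based on that docs page, assuming an older gpt_index release where LLMPredictor wraps a LangChain OpenAI LLM; the model name, the 512-token limit, and the "data" directory are just placeholders):

```python
from langchain import OpenAI
from gpt_index import GPTSimpleVectorIndex, LLMPredictor, SimpleDirectoryReader

# Wrap an OpenAI LLM with a larger max_tokens (the default is 256).
llm_predictor = LLMPredictor(
    llm=OpenAI(temperature=0, model_name="text-davinci-003", max_tokens=512)
)

# Build the index with the custom predictor and query it as before.
documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents, llm_predictor=llm_predictor)

response = index.query("Your question here")
print(response)
```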

If you are also using a prompt helper, make sure num_output and max_tokens are the same (if you aren't setting the prompt helper yourself, this happens automatically).
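
If you do construct a PromptHelper yourself, here's a sketch of keeping num_output in line with max_tokens (again assuming the older gpt_index API; the 4096 input size and 20-token chunk overlap are illustrative values, not requirements):

```python
from langchain import OpenAI
from gpt_index import (
    GPTSimpleVectorIndex,
    LLMPredictor,
    PromptHelper,
    SimpleDirectoryReader,
)

max_tokens = 512
llm_predictor = LLMPredictor(
    llm=OpenAI(temperature=0, model_name="text-davinci-003", max_tokens=max_tokens)
)
prompt_helper = PromptHelper(
    max_input_size=4096,    # context window of the underlying model
    num_output=max_tokens,  # keep this equal to the LLM's max_tokens
    max_chunk_overlap=20,
)

documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(
    documents,
    llm_predictor=llm_predictor,
    prompt_helper=prompt_helper,
)
```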