i figured out how to increase the max

I figured out how to increase max_tokens so that my answers don't get cut off. However, I don't see a way to do this if you load_from_disk your index (without creating the index in the session from scratch)... am I missing something?
3 comments
Like, if you simply do:

Plain Text
index = GPTSimpleVectorIndex.load_from_disk("my\\file\\path\\index.json")
response = index.query("this is my question")
print(response)


is there any way to increase max_tokens?
You can also pass the llm_predictor into index.query (the llm_predictor wraps the LLM class, which has max_tokens set):

https://gpt-index.readthedocs.io/en/latest/how_to/custom_llms.html#example-changing-the-number-of-output-tokens-for-openai-cohere-ai21
is the reference, but you can do something like index.query(..., llm_predictor=llm_predictor)
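Putting the two replies together, a minimal sketch might look like the following. This assumes the old gpt-index API from the linked docs page (where LLMPredictor wraps a LangChain LLM and max_tokens is set on that LLM); exact import paths and the 512-token value are illustrative and may differ in your version:

```python
from langchain import OpenAI
from llama_index import GPTSimpleVectorIndex, LLMPredictor

# Define an LLM with a larger output budget so answers don't get truncated.
# (512 is an arbitrary example value.)
llm_predictor = LLMPredictor(
    llm=OpenAI(temperature=0, model_name="text-davinci-003", max_tokens=512)
)

# Load the index from disk as before, then pass llm_predictor at query time.
index = GPTSimpleVectorIndex.load_from_disk("my\\file\\path\\index.json")
response = index.query("this is my question", llm_predictor=llm_predictor)
print(response)
```

The key point is that max_tokens lives on the LLM wrapped by the predictor, so you don't need to rebuild the index to change it; you can override it per query.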
Ahh perfect, thank you!