Is the num_output in PromptHelper working correctly (with ListIndex)? My outputs continue to come out at 256 tokens despite overriding it with a larger number.
Oh, the main thing you have to set is the LLM in langchain (see https://gpt-index.readthedocs.io/en/latest/how_to/custom_llms.html). The prompt helper's num_output can be derived from the LLMPredictor (so you don't need to set PromptHelper at all if you use OpenAI, AI21, or Cohere), but otherwise you'll still have to define a prompt helper manually.
Thanks so much. I believe I have it working: I do see longer outputs in some of my prompts. I'm having a secondary problem, though, where one particular prompt causes the response to get cut off.
The problematic prompt is of the format: "Generate 4 multiple choice questions with 4 answer choices each and include detailed answer explanations." The response usually gets truncated after outputting one question with its answer explanation. For some reason, if I don't ask for answer explanations, it successfully outputs all four questions.
Anyway, long story short: max_tokens does seem to be working, so thank you!