Updated 2 years ago

How to increase the output token limit

At a glance

The community member is asking how to increase the output token limit, as their summaries often don't fit into 256 tokens and end abruptly. Another community member responds that they are using LangChain LLMs under the hood, and if they are using the OpenAI LLM, they need to increase the max_tokens parameter. They provide a link to a guide on how to feed the OpenAI LLM into an LLMPredictor for use with GPT Index.

How to increase the output token limit? My summaries often don't fit into 256 tokens and end abruptly
2 comments
we use LangChain LLMs under the hood, so assuming you're using the OpenAI LLM, you need to increase max_tokens. See https://gpt-index.readthedocs.io/en/latest/how_to/custom_llms.html for how to feed the OpenAI LLM into an LLMPredictor for use with GPT Index (don't worry about the prompt helper)
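A minimal sketch of what that answer describes, based on the API from the era of the linked guide (the `gpt_index` / early LlamaIndex package): raise `max_tokens` on the LangChain `OpenAI` LLM and pass it into an `LLMPredictor`. The model name, `max_tokens` value, and `"data"` directory are placeholder assumptions, and newer LlamaIndex releases have since changed this API, so treat this as illustrative rather than copy-paste ready. Running it requires the `gpt_index` and `langchain` packages plus an OpenAI API key.

```python
from langchain.llms import OpenAI
from gpt_index import LLMPredictor, GPTSimpleVectorIndex, SimpleDirectoryReader

# Raise max_tokens above the 256-token default so summaries aren't cut off.
# (text-davinci-003 and 512 are example choices, not from the original thread.)
llm = OpenAI(temperature=0, model_name="text-davinci-003", max_tokens=512)
llm_predictor = LLMPredictor(llm=llm)

# "data" is a placeholder path to your documents.
documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents, llm_predictor=llm_predictor)

response = index.query("Summarize the documents.")
print(response)
```

The key point is that the output limit lives on the underlying LangChain LLM, not on the index itself, which is why the fix is to configure the LLM and hand it to GPT Index via `LLMPredictor`.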