Response length

Does anyone else find that responses obtained through the LLM + OpenAI API are too short? Is it because most of the tokens are used up by the context?
As long as they aren't cut off, it's just the LLM deciding to write short responses lol

It'll take some prompt engineering to get it to write more.
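
For example, here's a minimal sketch of what that prompt engineering could look like, assuming the gpt-index-era `QuestionAnswerPrompt` API from the same docs linked below (the template text is illustrative, not the library default):

```python
from llama_index import QuestionAnswerPrompt

# Illustrative template: keeps the standard {context_str}/{query_str} slots
# but adds an explicit instruction to answer at length.
QA_PROMPT_TMPL = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question in detail, in several paragraphs: {query_str}\n"
)
qa_prompt = QuestionAnswerPrompt(QA_PROMPT_TMPL)

# Pass the custom template at query time, e.g.:
# response = index.query("your question", text_qa_template=qa_prompt)
```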

Just keep in mind that the default max tokens is 256 for OpenAI. You can change that, though: https://gpt-index.readthedocs.io/en/latest/how_to/customization/custom_llms.html#example-fine-grained-control-over-all-parameters
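
A minimal sketch of raising that limit, following the shape of the linked docs (the model name, `max_tokens` value, index class, and `"data"` directory are illustrative choices, not requirements):

```python
from langchain import OpenAI
from llama_index import GPTSimpleVectorIndex, LLMPredictor, ServiceContext, SimpleDirectoryReader

# Raise max_tokens above the 256 default so answers aren't cut short.
llm_predictor = LLMPredictor(
    llm=OpenAI(temperature=0, model_name="text-davinci-003", max_tokens=512)
)
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)

documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex.from_documents(documents, service_context=service_context)
# response = index.query("your question")
```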
Thanks. I will test with different settings.