Response length
Maximus
2 years ago
Does anyone else find that responses obtained through the LLM + OpenAI API are too short? Is it because most tokens are used by the context?
2 comments
Logan M
2 years ago
As long as they aren't cut off, it's just the LLM deciding to write short responses lol
It will take some prompt engineering to get it to write more.
Just keep in mind that the default max_tokens is 256 for OpenAI. You can change that, though:
https://gpt-index.readthedocs.io/en/latest/how_to/customization/custom_llms.html#example-fine-grained-control-over-all-parameters
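A minimal sketch along the lines of that docs page, assuming the gpt-index API of that era (an LLMPredictor wrapping langchain's OpenAI); the class names changed in later llama_index versions, and the "data" directory and the 512 value are illustrative, not fixed recommendations:

```python
# Sketch based on the linked gpt-index docs; imports and class names
# vary across versions, so treat this as illustrative rather than current API.
from langchain import OpenAI
from llama_index import GPTSimpleVectorIndex, LLMPredictor, SimpleDirectoryReader

# Raise max_tokens above the 256 default so the completion isn't capped early.
llm_predictor = LLMPredictor(
    llm=OpenAI(temperature=0, model_name="text-davinci-003", max_tokens=512)
)

documents = SimpleDirectoryReader("data").load_data()  # "data" is a placeholder path
index = GPTSimpleVectorIndex(documents, llm_predictor=llm_predictor)

response = index.query("Summarize the documents in detail.")
print(response)
```

Raising max_tokens only lifts the cap; pairing it with an explicit instruction in the query prompt (e.g. asking for a detailed answer) covers the prompt-engineering half of the suggestion above.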
Maximus
2 years ago
Thanks. I will test with different settings.