Increase tokens
Quentin · 2 years ago
I added this parameter, but it doesn't seem to work very well; the response is still often truncated. Could it be language-related?
[Attachment]
Logan M · 2 years ago
Try also setting max_tokens=num_output in the OpenAI definition.
Also, you'll want to use the ChatOpenAI class from langchain for gpt 3.5 turbo 👍
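For context on the advice above, here is a minimal pure-Python sketch of why a low output-token cap truncates replies. This only simulates the cap, it is not the OpenAI or langchain implementation: generation simply stops once the max_tokens budget of output tokens is spent, which is why raising max_tokens (num_output in LlamaIndex terms) stops the cut-off responses.

```python
def generate(reply_tokens, max_tokens):
    """Toy stand-in for an LLM's output loop: emit tokens until the
    full reply is finished or the max_tokens cap is hit, whichever
    comes first."""
    out = []
    for tok in reply_tokens:
        if len(out) >= max_tokens:
            break  # cap reached: the reply is truncated mid-sentence
        out.append(tok)
    return " ".join(out)


full_reply = "The quick brown fox jumps over the lazy dog".split()
print(generate(full_reply, max_tokens=4))   # truncated after 4 tokens
print(generate(full_reply, max_tokens=64))  # budget is ample, full reply
```

Languages that tokenize less efficiently (e.g. CJK text) consume the budget faster, so a cap that is fine for English can still truncate other languages; this may be the "language-related" effect asked about above.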
Quentin · 2 years ago
Thanks @Logan M, ChatOpenAI fixed my issue. I'll try the other approach you mentioned later.