
Increase tokens

I added this parameter, but it doesn't seem to work very well; the response is still often truncated. Could it be language-related?
Attachment: image.png

2 comments
Try also setting max_tokens=num_output in the OpenAI definition

Also, you'll want to use the ChatOpenAI class from langchain for gpt-3.5-turbo 👍
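
For reference, a minimal sketch of what both suggestions look like together, assuming the legacy llama_index ServiceContext/LLMPredictor API and langchain's ChatOpenAI (exact imports and class names vary by version, and the model name, data path, and token budget below are illustrative):

    # Sketch: set max_tokens on the LLM and num_output on the service context
    # so llama_index reserves room for the answer and the model is not cut off.
    from langchain.chat_models import ChatOpenAI
    from llama_index import (
        LLMPredictor,
        ServiceContext,
        GPTVectorStoreIndex,
        SimpleDirectoryReader,
    )

    num_output = 512  # example token budget for the generated answer

    # Use ChatOpenAI (not OpenAI) for gpt-3.5-turbo, and pass max_tokens here
    # so the completion itself is allowed to be this long.
    llm_predictor = LLMPredictor(
        llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0, max_tokens=num_output)
    )

    # Give llama_index the same budget so prompt packing leaves space for it.
    service_context = ServiceContext.from_defaults(
        llm_predictor=llm_predictor,
        num_output=num_output,
    )

    documents = SimpleDirectoryReader("data").load_data()
    index = GPTVectorStoreIndex.from_documents(documents, service_context=service_context)

    query_engine = index.as_query_engine()
    print(query_engine.query("your question here"))

The key point is that num_output alone only affects how llama_index budgets the prompt; the LLM's own max_tokens must also be raised, otherwise the completion is still truncated.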
Thanks @Logan M, ChatOpenAI fixed my issue. I will try the other approach you mentioned later.