anyone have any ideas?

Is it a hard stop? Could be a token cutoff? πŸ€”
I figured it out: there was a default max_tokens setting in the LLM provider.
Making that larger solved the issue.
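
For anyone hitting the same thing, here's a minimal sketch assuming an OpenAI-compatible provider (the model name and prompt are placeholders): pass max_tokens explicitly instead of relying on the provider default, and check finish_reason to confirm whether a reply was actually cut off by the token limit.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize this long document..."}],
    # Raise the output cap explicitly; a low provider default here
    # is what causes replies to stop mid-sentence.
    max_tokens=4096,
)

choice = response.choices[0]
# finish_reason == "length" means the reply was still truncated by the token limit.
print(choice.finish_reason)
print(choice.message.content)
```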