Xiao
9 months ago
Hi, I want to ask how to change the max token count for a local LLM.
For example, below is my demo code. How do I change the maximum number of generated tokens?
Logan M
9 months ago
Ollama(..., additional_kwargs={"num_predict": 256})
`additional_kwargs` accepts any option supported by the Ollama Modelfile API.
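A minimal sketch of what this looks like in practice, assuming a locally installed LlamaIndex and a running Ollama server (the model name `"llama3"` is an assumption, not from the thread). The live LLM call is left commented out so the snippet stands alone:

```python
# "num_predict" is the Ollama option that caps the number of generated tokens.
additional_kwargs = {"num_predict": 256}  # max tokens to generate

# With llama-index and a running Ollama server, you would pass it like this
# (hypothetical usage based on Logan M's answer above):
# from llama_index.llms.ollama import Ollama
# llm = Ollama(model="llama3", additional_kwargs=additional_kwargs)
# response = llm.complete("Tell me a joke")

print(additional_kwargs["num_predict"])
```

Any other Modelfile parameter (e.g. `temperature`, `num_ctx`) can be added to the same dict.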
Xiao
9 months ago
thank you