Hi, I want to ask how to change the max tokens for a local LLM?

For example, how do I change the maximum number of tokens generated in my demo code?

Ollama(..., additional_kwargs={"num_predict": 256})

additional_kwargs accepts any option supported by the Ollama Modelfile API (for example num_predict, num_ctx, or temperature).
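
Putting it together, a minimal runnable sketch with LlamaIndex's Ollama integration might look like this (assuming the llama-index-llms-ollama package is installed and a llama3 model has been pulled locally; swap in whatever model you actually use):

from llama_index.llms.ollama import Ollama

# num_predict is the Ollama option that caps how many tokens are generated;
# additional_kwargs forwards it to the Ollama API with every request.
llm = Ollama(
    model="llama3",  # assumption: use any model you have pulled locally
    request_timeout=120.0,
    additional_kwargs={"num_predict": 256},
)

# The completion will stop after at most 256 generated tokens.
response = llm.complete("Briefly explain what num_predict controls.")
print(response)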