
Hi, I want to ask how to change the max tokens for a local LLM?

At a glance

The community member asks how to change the maximum number of tokens generated by a local large language model (LLM). In the comments, another community member suggests passing the additional_kwargs parameter to the Ollama model and setting num_predict to 256 to cap the number of generated tokens. There is no explicitly marked answer in the comments.

Hi, I want to ask how to change the max tokens for a local LLM?

For example, below is my demo code. How do I change the maximum number of generated tokens?
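(The original snippet is not shown here; the block below is a minimal stand-in sketch, assuming the LlamaIndex Ollama integration, with a hypothetical model name and prompt.)

```python
# Stand-in sketch of a typical local-LLM demo, assuming the LlamaIndex
# Ollama integration; the model name and prompt are placeholders.
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama2", request_timeout=120.0)

# Generation length is left at the model's default here.
response = llm.complete("Explain what a context window is.")
print(response.text)
```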
2 comments
Ollama(... additional_kwargs={"num_predict": 256})

additional_kwargs takes any kwarg supported by the Ollama modelfile API
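Put together, a runnable sketch of that suggestion might look like this (assuming the LlamaIndex Ollama integration and a local Ollama server; the model name and prompt are placeholders):

```python
# Capping generation length via additional_kwargs, assuming the LlamaIndex
# Ollama integration (llama_index.llms.ollama.Ollama) and a running Ollama
# server with the model already pulled (e.g. `ollama pull llama2`).
from llama_index.llms.ollama import Ollama

llm = Ollama(
    model="llama2",        # placeholder model name
    request_timeout=120.0,
    # num_predict caps the number of tokens generated per response;
    # additional_kwargs forwards any option the Ollama modelfile API accepts.
    additional_kwargs={"num_predict": 256},
)

response = llm.complete("Explain what a context window is.")
print(response.text)
```

Other modelfile options (e.g. temperature or top_k) can be passed through additional_kwargs the same way.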