Updated 9 months ago
Hi guys, sorry quick question, is there a way to set a global top_k, top_p, max_tokens in Settings.llm?
max
9 months ago
Hi guys, sorry quick question, is there a way to set a global top_k, top_p, max_tokens in Settings.llm? I'm using Mistral and couldn't find anything in the documentation (generate_kwarg and additional_kwarg seem not to work)
5 comments
WhiteFang_Jr
9 months ago
top_p and max_tokens are part of the LLM, so setting them while initialising your LLM makes more sense.
For top_k, you can set it on the query_engine directly!
Are they not working when you set them there?
max
9 months ago
Thanks for your reply!
It's OK for top_k.
I agree with you about setting top_p and max_tokens while initialising my LLM; that's why I would like to use it like this:
Settings.llm = MistralAI(api_key=api_key, model="mistral-small", top_p=0.90, max_tokens=300)
Is there another way?
WhiteFang_Jr
9 months ago
You can set max_tokens like you have defined here.
For the others, you can pass them as a dict, as mentioned here:
https://github.com/run-llama/llama_index/blob/d63fec1c69a2e1e51bf884a805b9fd31ad8d1ee9/llama-index-integrations/llms/llama-index-llms-mistralai/llama_index/llms/mistralai/base.py#L72
WhiteFang_Jr
9 months ago
The keyword should be exact:
additional_kwargs
max
9 months ago
Thanks a lot !