Hi guys, how to configure top_p and top_k for Vertex AI in LlamaIndex?

Hi guys, how do I configure top_p and top_k for Vertex AI in LlamaIndex? It's not working when I pass them in additional_kwargs.
Maybe check what keyword Vertex AI uses. And how you are passing it also matters.
@WhiteFang_Jr

from llama_index.llms.vertex import Vertex

llm = Vertex(
    model=MODEL_NAME,
    safety_settings=safety_settings,
    project=credentials.project_id,
    credentials=credentials,
    temperature=TEMPERATURE,
    max_tokens=MAX_TOKENS,
    additional_kwargs={
        "top_p": 0  # attempt to set nucleus sampling via additional_kwargs
    },
)

Is this the correct way to pass it?
You've got to make sure what keywords Vertex uses for top_p and the others, but yes, this is how you would add them.
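For reference, here is a minimal sketch of how that could look with both parameters spelled out, reusing the names from the snippet above (MODEL_NAME, credentials, and so on). It assumes the llama-index Vertex integration forwards additional_kwargs to the underlying Vertex generation call and that the SDK names its sampling parameters top_p and top_k; the values shown are placeholders, not recommendations.

from llama_index.llms.vertex import Vertex

# Sketch only: assumes additional_kwargs is forwarded to the Vertex
# generation call and that the parameter names are "top_p" and "top_k".
llm = Vertex(
    model=MODEL_NAME,
    safety_settings=safety_settings,
    project=credentials.project_id,
    credentials=credentials,
    temperature=TEMPERATURE,
    max_tokens=MAX_TOKENS,
    additional_kwargs={
        "top_p": 0.9,  # nucleus sampling: sample from tokens covering 90% of probability mass
        "top_k": 40,   # restrict sampling to the 40 most likely tokens
    },
)

response = llm.complete("Say hello in one short sentence.")
print(response.text)

If the parameters still have no effect, checking the keyword names accepted by the Vertex SDK's generation config is the first thing to verify.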