Kwargs

Hello everyone, is there some documentation on the **kwargs of LLM.complete? I'm trying to pass parameters on the fly within the complete method, e.g. LLM.complete(prompt=prompt, temperature=1.0, output_tokens=300), instead of defining the LLM object with these parameters. Is there a way to do so? I haven't been able to figure this out despite spending quite a bit of time on it.
The kwargs are typically sent directly into the underlying LLM API being used (e.g. the OpenAI API).

The set of accepted kwargs is basically endless and changes per LLM.
I have tried this without success; I'll try again with more care then. Thank you.
Actually, after looking more closely I've been able to make it work. For the record (and for Gemini, which I am using right now), you need to specify LLM.complete(prompt, generation_config={'temperature': 1.0, 'other argument': something}). What I was missing was the generation_config keyword. Thank you for your help.
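To illustrate why the bare temperature=1.0 kwarg was silently ignored while generation_config worked: the wrapper forwards **kwargs straight to the provider client, and a Gemini-style client only reads generation options out of a generation_config dict. This is a minimal, self-contained sketch with stand-in classes (FakeGeminiClient, LLM are illustrative, not the real llama-index or Google SDK classes):

```python
class FakeGeminiClient:
    """Stand-in for a provider SDK that only honors a generation_config dict."""

    def generate(self, prompt, generation_config=None, **_ignored):
        # Unrecognized kwargs (like a bare temperature=...) land in _ignored
        # and have no effect; only generation_config is consulted.
        config = generation_config or {}
        return {"prompt": prompt, "temperature": config.get("temperature", 0.7)}


class LLM:
    """Stand-in wrapper whose complete() forwards **kwargs to the client."""

    def __init__(self):
        self._client = FakeGeminiClient()

    def complete(self, prompt, **kwargs):
        # kwargs pass straight through to the underlying API call.
        return self._client.generate(prompt, **kwargs)


llm = LLM()
# A bare temperature kwarg is silently dropped by this client:
print(llm.complete("hi", temperature=1.0)["temperature"])  # 0.7 (default)
# Wrapped in generation_config, it takes effect:
print(llm.complete("hi", generation_config={"temperature": 1.0})["temperature"])  # 1.0
```

The same shape explains the earlier comment that accepted kwargs "change per LLM": each provider client decides which keyword arguments it reads, so check the provider SDK's own call signature.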