
Updated 8 months ago

Hi guys, what's the simplest way to check the active LLM model for a query?
5 comments
You want to check which LLM is being used to generate the response?
I know the default is gpt-3.5, but sometimes I change it, and it would be nice to double-check.
And if I'm setting it like this:

Settings.llm = OpenAI(temperature=0.1, model=config.get("llm_model"))

it should set it for everything everywhere, right?