
Updated 11 months ago

Hi guys, what's the simplest way to check the active LLM model for a query?

At a glance

The community members discuss how to check which large language model (LLM) is active for a given query. The asker notes that the default is GPT-3.5, but that they sometimes change the model and would like a way to double-check which one is actually being used. They set the LLM with Settings.llm = OpenAI(temperature=0.1, model=config.get("llm_model")) and ask whether this applies globally to everything. There is no explicitly marked answer in the comments.

Hi guys, what's the simplest way to check the active LLM model for a query?
You want to check which LLM is being used to generate the response?
I know the default is GPT-3.5, but sometimes I change it, and it would be nice to double-check.
And if I'm setting it like this:
Settings.llm = OpenAI(temperature=0.1, model=config.get("llm_model"))
it should set it for everything everywhere, right?
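For reference, a minimal sketch of how this usually looks with LlamaIndex's global Settings object (assuming llama-index 0.10+ with the OpenAI integration installed; the model name "gpt-4o-mini" stands in for whatever config.get("llm_model") returns, and the exact attributes inspected may vary by version):

from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

# Set the global LLM once; components that aren't given an explicit llm fall back to this.
Settings.llm = OpenAI(temperature=0.1, model="gpt-4o-mini")

# Double-check which model is active before running a query.
print(Settings.llm.model)                # OpenAI-specific field, e.g. "gpt-4o-mini"
print(Settings.llm.metadata.model_name)  # same info via the generic LLM metadata

Note that a component given its own llm argument (for example a query engine constructed with an explicit llm) overrides the global setting, so inspecting Settings.llm only tells you the default being used everywhere else.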