marcellk
8 months ago
Hi guys, what's the simplest way to check the active LLM model for a query?
WhiteFang_Jr
8 months ago
You want to check which LLM is being used to generate the response?
marcellk
8 months ago
exactly
marcellk
8 months ago
I know the default is gpt-3.5, but sometimes I change it, and it would be nice to double-check.
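
(A minimal sketch of one way to do that double-check, assuming LlamaIndex's global Settings object and the current llama_index.core import layout; the model name below is only a placeholder, and OPENAI_API_KEY is assumed to be set in the environment.)

from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(temperature=0.1, model="gpt-4o-mini")  # placeholder model name

# Every LlamaIndex LLM exposes metadata, including the model name,
# so this works regardless of which provider class is configured.
print(Settings.llm.metadata.model_name)

# The OpenAI class also keeps the model string directly on the instance.
print(Settings.llm.model)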
marcellk
8 months ago
And if I'm setting it like this:
Settings.llm = OpenAI(temperature=0.1, model=config.get("llm_model"))
it should set it for everything everywhere, right?
WhiteFang_Jr
8 months ago
yes
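
(A minimal sketch of that global setup, illustrating the answer above; the config dict, the "data" directory, and the question string are hypothetical stand-ins, and an OPENAI_API_KEY is assumed for both the LLM and the default embeddings.)

from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.openai import OpenAI

config = {"llm_model": "gpt-4o-mini"}  # hypothetical stand-in for the real config object

# Set once; any component that is not given an explicit llm= override
# (query engines, chat engines, response synthesizers) falls back to this.
Settings.llm = OpenAI(temperature=0.1, model=config.get("llm_model"))

documents = SimpleDirectoryReader("data").load_data()   # hypothetical data directory
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()   # no llm= passed, so Settings.llm is used
print(Settings.llm.metadata.model_name)  # double-check the active model before querying
response = query_engine.query("What does the document say?")  # hypothetical question
print(response)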