

LLMPredictor(llm=...) definition in the new version

The new version seems to prefer a definition like this:
llm_predictor = LLMPredictor(llm=ChatOpenAI(temperature=temperature, model_name="gpt-3.5-turbo", max_tokens=num_outputs))
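
For context, here is a minimal runnable sketch of that pattern. It assumes an older llama_index release where LLMPredictor and ServiceContext are top-level imports and ServiceContext.from_defaults accepts llm_predictor; the temperature and num_outputs values are placeholders, not values from this thread.

```python
from langchain.chat_models import ChatOpenAI
from llama_index import LLMPredictor, ServiceContext

# Placeholder values; tune these for your own use case.
temperature = 0.0
num_outputs = 512

# Wrap the LangChain chat model in an LLMPredictor, as the newer API prefers.
llm_predictor = LLMPredictor(
    llm=ChatOpenAI(
        temperature=temperature,
        model_name="gpt-3.5-turbo",  # the cheaper chat model mentioned in the thread
        max_tokens=num_outputs,
    )
)

# Hand the predictor to llama_index via a ServiceContext rather than passing the
# LLM to the index directly (assumption: this llama_index version supports it).
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)
```

From there, the service_context would typically be passed to an index constructor (for example, a vector store index built from your documents), but the exact call depends on which llama_index version you are on.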
5 comments
Ok, I will change it, thank you. I am also getting a warning; I will try it out and get the debug logs.
I am a newbie, so... I can't help you much 😅
Please wait for the big bad boys to come to the rescue.
Haha, it's fine 😄
I just wanted to try it out with a cheaper model first. Maybe it will be good enough for our use case.