
Updated 4 months ago

Tuple

At a glance

The post asks for help with an error in the declaration of an LLM object. Community members point out a trailing comma after the LLM declaration, which converts the object into a tuple. They also note that the code works with the GPT-4 model but not with the Mistral model, and confirm that removing the comma resolves the issue.

Somebody help me please
Attachment: image.png
6 comments
You have a comma after the llm declaration. That is converting the llm object into a tuple
Are you sure? Because when I set the llm to gpt-4 it works, it just doesn't work with Mistral
Yeah, try removing the comma on line 3:

llm = MistralAI(model="mistral-medium")
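The pitfall behind this fix is a general Python one: a trailing comma after an assignment wraps the value in a one-element tuple. A minimal sketch (using a stand-in class rather than the real MistralAI, so it runs without llama-index installed):

```python
# Stand-in for an LLM class such as MistralAI; the tuple behavior
# is identical for any object.
class FakeLLM:
    def __init__(self, model):
        self.model = model

# Trailing comma: Python parses this as a one-element tuple,
# so llm is a tuple, not a FakeLLM instance.
llm = FakeLLM(model="mistral-medium"),
print(type(llm))  # <class 'tuple'>

# Comma removed: llm is now the object itself.
llm = FakeLLM(model="mistral-medium")
print(type(llm))  # <class 'FakeLLM'>
```

This is why the error surfaces only later, when code tries to call model methods on what is actually a tuple.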
Ohhh, that's what you mean
Yeah πŸ˜…
Did it work?