hey, getting an error when trying to use gpt-4-0613

hey, I'm getting an error when trying to use gpt-4-0613 with
LLMPredictor(ChatOpenAI(model='gpt-4-0613'))
Plain Text
Error - Unknown model: gpt-4-0613. Please provide a valid OpenAI model name. Known models are: gpt-4, gpt-4-0314, gpt-4-32k, gpt-4-32k-0314, gpt-3.5-turbo, gpt-3.5-turbo-0301, text-ada-001, ada, text-babbage-001, babbage, text-curie-001, curie, davinci, text-davinci-003, text-davinci-002, code-davinci-002, code-davinci-001, code-cushman-002, code-cushman-001.

please help
cc: @Logan M @ravitheja
Perhaps upgrade your langchain version?
same issue with langchain 0.0.201 😒
oh! I think the correct kwarg is model_name

Plain Text
LLMPredictor(llm=ChatOpenAI(temperature=0, model_name="gpt-3.5-turbo-0613"))  # or gpt-4-0613
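
For context, a minimal self-contained sketch of that suggestion is below. The import paths assume a langchain 0.0.20x-era setup and a llama_index release that still exposes LLMPredictor; they may differ in newer releases.

Plain Text
# Minimal sketch of the model_name fix; assumes langchain >= 0.0.204 (so the
# 0613 models are recognized) and a llama_index version exposing LLMPredictor
from langchain.chat_models import ChatOpenAI
from llama_index import LLMPredictor

llm_predictor = LLMPredictor(
    llm=ChatOpenAI(temperature=0, model_name="gpt-4-0613")
)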
same error with latest langchain 0.0.202

Plain Text
    def get_llm_predictor(self, user_id):
        # LLM
        self.llm = ChatOpenAI(
            model_name=self.model_name,
            temperature=self.temperature,
            model_kwargs={
                "frequency_penalty": self.frequency_penalty,
                "top_p": self.top_p,
                "headers": conf.PORTKEY_HEADERS,
                "user": user_id,
            },
            max_tokens=self.max_tokens,
        )

        # LLM Predictor
        return LLMPredictor(llm=self.llm)

😒
fixed with langchain 0.0.204 πŸ’ͺ
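
For anyone landing here later, a quick generic way to confirm which langchain version is actually installed (a standard-library check, not something from the thread):

Plain Text
# Prints the installed langchain version; it should be 0.0.204 or newer
# for gpt-4-0613 / gpt-3.5-turbo-0613 to be accepted
from importlib.metadata import version

print(version("langchain"))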