Find answers from the community

Updated last year

GitHub - openai/openai-python: The offic...

At a glance

The community members are facing an issue with the openai library: the openai.ChatCompletion method is no longer supported in version 1.0.0 and above. One community member suggests using the new openai.chat.completions.create() method instead. Another community member, who hits the error through the query_engine from the llama-index library, asks how to fix it there. After comparing their openai and llama-index versions, a community member suggests creating a fresh environment with the latest version of llama-index, which resolves the issue.

Useful resources
Hello, I am facing this error
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.

How to fix?
11 comments
Hello, @Logan M , @WhiteFang_Jr , Any advice ?
Yeah I think they moved the method and it is called now like this in their new library:
Plain Text
import openai  # requires openai>=1.0.0

completion = openai.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "user",
            "content": "How do I output all files in a directory using Python?",
        },
    ],
)
print(completion.choices[0].message.content)
I am using the query_engine from an index via llama-index.

In llama-index, is there any solution to fix this?
It was working fine, but I am facing that issue today. Strange, @WhiteFang_Jr
Is your openai package updated? Also, what version of LlamaIndex are you using? It seems like an older version of LlamaIndex, I guess
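One quick way to answer both version questions at once is to query the installed package metadata (a minimal sketch using only the standard library; the strings are the PyPI distribution names, which for llama-index differ from the import name):

```python
import importlib.metadata

# Print the installed versions of the packages under discussion.
# PackageNotFoundError simply means the package is absent from this environment.
for pkg in ("openai", "llama-index"):
    try:
        print(pkg, importlib.metadata.version(pkg))
    except importlib.metadata.PackageNotFoundError:
        print(pkg, "is not installed in this environment")
```

`pip show openai llama-index` from the command line reports the same information.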
0.9.11.post1

I am using this version of llama-index
openai is 1.3.7
Try spinning up a new env and checking with the latest version; I have 0.9.30 and it is working fine
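Spinning up a fresh environment as suggested might look like this (a sketch assuming a Unix-like shell; the exact versions installed will depend on PyPI at install time):

```shell
# Create and activate a clean virtual environment
python3 -m venv fresh-env
source fresh-env/bin/activate

# Install the latest llama-index and openai into it
pip install --upgrade llama-index openai

# Confirm what actually got installed
pip show llama-index openai
```

On Windows, the activation step is `fresh-env\Scripts\activate` instead.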
@WhiteFang_Jr , I have fixed the issue now
Awesome! Did you create a new env or did something else?