Community members are running into an error with the openai library: openai.ChatCompletion is no longer supported in version 1.0.0 and above. One community member suggests using the new openai.chat.completions.create() method instead. Another community member hits the error through the query_engine from the llama-index library and is looking for a fix. After comparing their openai and llama-index versions, a community member suggests trying a fresh environment with the latest version of llama-index, which appears to resolve the issue.
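For the llama-index route mentioned in the summary, a fresh environment with up-to-date packages is the suggested fix. A minimal sketch of that setup is below; it assumes a local ./data folder of documents and that OPENAI_API_KEY is set in the environment, and the import path may be llama_index.core on newer llama-index releases.
Plain Text
# In a fresh virtual environment (hypothetical setup; exact pins depend on your project):
#   pip install -U llama-index openai

from llama_index import SimpleDirectoryReader, VectorStoreIndex  # newer releases: from llama_index.core import ...

# Load documents from a local ./data folder and build an in-memory vector index.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# The query engine calls the OpenAI chat API under the hood, so the installed
# llama-index and openai versions need to be compatible with each other.
query_engine = index.as_query_engine()
response = query_engine.query("What is this document about?")
print(response)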
Hello, I am facing this error: "You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API."
Yeah, I think they moved the method; in their new library it's now called like this:
Plain Text
import openai  # requires openai>=1.0.0; expects OPENAI_API_KEY to be set in the environment

completion = openai.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "user",
            "content": "How do I output all files in a directory using Python?",
        },
    ],
)
print(completion.choices[0].message.content)
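For what it's worth, the openai-python README also shows an explicit client rather than the module-level helpers; something along these lines should be equivalent:
Plain Text
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment by default

completion = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "user", "content": "How do I output all files in a directory using Python?"},
    ],
)
print(completion.choices[0].message.content)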