ChatGPT

At a glance

A community member tried to replace the text-davinci-003 model with gpt-3.5-turbo-0301, but it didn't work. They asked whether the newly released OpenAI model can be used and whether it will be supported.

In the comments, other community members responded that the new model is supported, but may not work as well as davinci for now. They suggested updating the llama-index installation and pointed to a demo showing how to use ChatGPT. They also mentioned that the openai and langchain libraries might need to be updated if issues persist.

One community member confirmed that gpt-3.5-turbo can be used with langchain's API, but the OpenAIChat class should be used instead of OpenAI. They provided a link to a Colab notebook demonstrating this.

Finally, a community member confirmed that the solution worked for them.

I tried to replace the model text-davinci-003 with gpt-3.5-turbo-0301. It doesn't work. Is it possible to use the new model based on what OpenAI released? And are we going to support it? @jerryjliu0
It's supported 💪 but may not work as well as davinci for now

Make sure you update your llama-index installation (lots of new releases lately)

pip install --upgrade llama-index

There's also a demo here with how to use chat-gpt: https://github.com/jerryjliu/gpt_index/blob/main/examples/vector_indices/SimpleIndexDemo-ChatGPT.ipynb
You might also have to update your openai and langchain libraries specifically if you still have issues
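
If that's the case, the same upgrade approach shown above for llama-index applies (the package names here are just the standard PyPI ones, not quoted from the thread):

pip install --upgrade openai langchain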
yeah you can still use gpt-3.5-turbo with langchain's API, but you'll have to use the OpenAIChat class from langchain.llms, not OpenAI (take a look at https://colab.research.google.com/drive/1IJAKd1HIe-LvFRQmd3BCDDIsq6CpOwBj?usp=sharing)
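
For illustration, here is a minimal sketch of that pattern, assuming the gpt_index/llama-index and langchain APIs from around the time of this thread (class names such as GPTSimpleVectorIndex and LLMPredictor have since been renamed or replaced in newer releases, and the "data" directory and query text are placeholders):

from langchain.llms import OpenAIChat
from llama_index import GPTSimpleVectorIndex, LLMPredictor, SimpleDirectoryReader

# gpt-3.5-turbo is a chat model, so it goes through OpenAIChat rather than OpenAI
llm_predictor = LLMPredictor(llm=OpenAIChat(temperature=0, model_name="gpt-3.5-turbo"))

# load documents from a local "data" directory (placeholder path)
documents = SimpleDirectoryReader("data").load_data()

# build the index with the chat-based predictor and query it
index = GPTSimpleVectorIndex(documents, llm_predictor=llm_predictor)
response = index.query("What does the document say about X?")
print(response)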
cool, it works. Thanks!