Is there any reason why the model `gpt-3.5-turbo-16k` is not supported?

At a glance

A community member added the "gpt-3.5-turbo-16k" model to the list of supported models in their code, and it has been working fine. However, another community member clarified that this model was not supported at the time the list was originally written, but the next release will remove this list entirely in favor of a different way of checking support.

Is there any reason why the model gpt-3.5-turbo-16k is not supported by llama-index? In openai_agent.py I just added the model to the list of supported names shown below, and for days now it's been working perfectly fine, with zero issues.

Python
SUPPORTED_MODEL_NAMES = [
    "gpt-3.5-turbo-0613",
    "gpt-4-0613",
    "gpt-3.5-turbo-16k",  # <-- Added this
]
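
For context, this is roughly how the patched list gets exercised in practice; a minimal sketch assuming the pre-0.10 llama-index import paths in use at the time, with a placeholder multiply tool:

Python
from llama_index.agent import OpenAIAgent
from llama_index.llms import OpenAI
from llama_index.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# The agent validates the model name against SUPPORTED_MODEL_NAMES,
# so "gpt-3.5-turbo-16k" only works once it appears in that list.
agent = OpenAIAgent.from_tools(
    tools=[FunctionTool.from_defaults(fn=multiply)],
    llm=OpenAI(model="gpt-3.5-turbo-16k"),
    verbose=True,
)
print(agent.chat("What is 7 times 6?"))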
2 comments
When we originally wrote that a few weeks ago, 16K was not supported

The next release removes this list entirely, in favour of a different way of checking support
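
For anyone patching locally in the meantime, a prefix-style check along these lines covers the 16k variant without enumerating every snapshot. This is only a sketch of one possible approach; the helper name and prefixes here are hypothetical, not necessarily what the release actually uses:

Python
# Hypothetical prefix-based check; the actual replacement in the release may differ.
FUNCTION_CALLING_MODEL_PREFIXES = (
    "gpt-3.5-turbo-0613",
    "gpt-3.5-turbo-16k",
    "gpt-4-0613",
)

def is_supported_model(model: str) -> bool:
    """Return True if the model name matches a known function-calling family."""
    return model.startswith(FUNCTION_CALLING_MODEL_PREFIXES)

assert is_supported_model("gpt-3.5-turbo-16k-0613")
assert not is_supported_model("text-davinci-003")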
Oh ok, thanks for the clarification