Hi guys, do you know if it's possible to use the `response_format` option from the OpenAI client (https://github.com/openai/openai-python/blob/main/src/openai/resources/chat/completions.py#L69) in llama_index?

From what I understood reading the code, there are no kwargs being forwarded to the client, but I'm not sure if there's some other way around it.