WillFuks · last year
Hi guys, do you know if it'd be possible to use the response_format option from the OpenAI client (https://github.com/openai/openai-python/blob/main/src/openai/resources/chat/completions.py#L69) in llama_index? From what I understood of the code, no kwargs are being sent to the client, but I'm not sure if there's some other way around it.
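For reference, this is roughly what I mean with the raw client (a minimal sketch; the model name is just an example):

```python
from openai import OpenAI

client = OpenAI()

# response_format={"type": "json_object"} enables JSON mode, which
# constrains the model to emit valid JSON. Note the prompt itself must
# mention JSON for the API to accept this option.
completion = client.chat.completions.create(
    model="gpt-3.5-turbo-1106",
    messages=[
        {"role": "user", "content": "Return a JSON object with a 'colors' key."}
    ],
    response_format={"type": "json_object"},
)
print(completion.choices[0].message.content)
```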
Logan M · last year
Only really works right now if you use the LLM directly:
https://docs.llamaindex.ai/en/stable/examples/llm/openai_json_vs_function_calling.html#data-extraction-with-json-mode
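Something along these lines should work (a rough sketch; import paths depend on your llama_index version, and I'm assuming additional_kwargs is forwarded to the underlying openai client on each request):

```python
from llama_index.llms.openai import OpenAI
from llama_index.core.llms import ChatMessage

# Extra kwargs set here are passed through to the openai client call,
# so response_format reaches the API even though the higher-level
# abstractions don't expose it directly.
llm = OpenAI(
    model="gpt-3.5-turbo-1106",
    additional_kwargs={"response_format": {"type": "json_object"}},
)

# JSON mode still requires the word "JSON" somewhere in the messages.
messages = [
    ChatMessage(role="user", content="List three colors as a JSON object."),
]
print(llm.chat(messages))
```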