Strict

@Logan M - I can see strict mode enabled in OpenAI, but the same is not enabled through AzureOpenAI, is this right?
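For reference, this is roughly the setup I mean; a minimal sketch only, assuming both llama-index classes accept a strict keyword (the Azure side is exactly what I'm asking about), with every credential, deployment name, and API version below being a placeholder:

```python
# Sketch only: assumes the llama-index OpenAI/AzureOpenAI classes both accept
# a `strict` kwarg; endpoint, key, deployment, and api_version are placeholders.
from llama_index.llms.openai import OpenAI
from llama_index.llms.azure_openai import AzureOpenAI

# Plain OpenAI: strict mode is available here.
openai_llm = OpenAI(model="gpt-4o-mini", strict=True)

# AzureOpenAI: does the same flag work?
azure_llm = AzureOpenAI(
    engine="my-gpt-4o-deployment",                           # placeholder deployment
    api_key="<azure-key>",                                   # placeholder
    azure_endpoint="https://my-resource.openai.azure.com/",  # placeholder
    api_version="2024-08-01-preview",  # strict outputs reportedly need a recent version
    strict=True,
)
```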
11 comments
Yeah, not sure if Azure supports this param yet or not (and if they do, it's probably only in the very latest API version for Azure)
Yes, it's supported in llama-index, I confirmed it
but somehow it's not working with the OpenAI model
seems like the generated schema is incorrect
Certain schemas are not valid in strict mode
That's my understanding anyways, I tried a lot to fix this
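For context, strict mode constrains the JSON schema quite a bit: every property must be listed as required, additionalProperties must be false, and open-ended dict/Any types aren't allowed, so some Pydantic models produce schemas the API rejects. A rough illustration, with made-up field names:

```python
from typing import Any, Dict, Optional
from pydantic import BaseModel

class ProblematicSchema(BaseModel):
    # Open-ended dicts map to JSON objects with arbitrary keys,
    # which strict mode rejects (additionalProperties must be false).
    metadata: Dict[str, Any]
    # A field with a default is emitted as non-required, also not allowed
    # in strict mode (optional fields must instead be required + nullable).
    note: Optional[str] = None

class StrictFriendlySchema(BaseModel):
    # Every field required, concrete types only.
    name: str
    note: str
```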
On a tangential note, there are actually two types of structured data extraction, function calling and response_format, but I suppose in llama-index we are doing structured output extraction, i.e. llm.as_structured(), using only function calling
I would also like to know more about this, as I am planning to use entity extraction at a larger scale, with around 5 to 60 entities being extracted, so I would need some guarantee of the output format.
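For what it's worth, here is a rough sketch of how that extraction path can look in llama-index; the method names and the strict kwarg are assumptions about current versions, it uses tool/function calling under the hood for OpenAI-style LLMs, and the entity fields are invented:

```python
# Sketch only: structured_predict and the strict kwarg are assumptions about
# current llama-index versions; the Entity fields are invented for illustration.
from typing import List
from pydantic import BaseModel, Field
from llama_index.core import PromptTemplate
from llama_index.llms.openai import OpenAI

class Entity(BaseModel):
    """One extracted entity (illustrative fields only)."""
    name: str = Field(description="Surface form of the entity")
    label: str = Field(description="Entity type, e.g. PERSON or ORG")

class Entities(BaseModel):
    """Wrapper so the output is guaranteed to be a list of entities."""
    entities: List[Entity]

llm = OpenAI(model="gpt-4o-mini", strict=True)

# structured_predict drives a tool call and parses the result into Entities.
result = llm.structured_predict(
    Entities,
    PromptTemplate("Extract all named entities from: {text}"),
    text="Ada Lovelace worked with Charles Babbage in London.",
)
print(result.entities)
```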
Function calling is functionally the same as response_format, it's just a different UX
I'm afraid I don't know much more haha
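To illustrate that point, the two request shapes at the raw OpenAI API level look roughly like this, the same JSON schema just carried in a different field (worth double-checking field names against current docs):

```python
# Sketch of the two request shapes; model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()

schema = {
    "type": "object",
    "properties": {"name": {"type": "string"}},
    "required": ["name"],
    "additionalProperties": False,
}

# 1) Function calling: the schema rides inside a tool definition.
client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Extract the name: Ada Lovelace"}],
    tools=[{
        "type": "function",
        "function": {"name": "extract", "parameters": schema, "strict": True},
    }],
)

# 2) response_format: the same schema rides in the response format.
client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Extract the name: Ada Lovelace"}],
    response_format={
        "type": "json_schema",
        "json_schema": {"name": "extract", "schema": schema, "strict": True},
    },
)
```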

Set strict=True and give it a shot
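i.e. something along these lines, assuming the structured-LLM wrapper is as_structured_llm in your version, and reusing the Entities model from the sketch above:

```python
# Sketch: strict mode plus the structured-LLM wrapper; reuses the Entities
# model defined in the earlier sketch.
from llama_index.llms.openai import OpenAI

sllm = OpenAI(model="gpt-4o-mini", strict=True).as_structured_llm(Entities)
output = sllm.complete("Extract all named entities from: Ada Lovelace met Charles Babbage.")
print(output.text)  # should be JSON conforming to the Entities schema
```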