
Updated 2 months ago

Strict

At a glance

The post asks whether strict mode is enabled in AzureOpenAI, as it is in OpenAI. The comments suggest that strict mode may not be supported in the latest Azure API version, but that it is supported in llama-index. However, community members are having trouble getting it to work with the OpenAI model, as the generated schema appears to be invalid for strict mode. They discuss the two types of structured data extraction, function calling and response_format, and the need for a guaranteed output format when extracting a large number of entities. There is no explicitly marked answer.

@Logan M - I can see strict mode enabled in OpenAI, but the same is not enabled through AzureOpenAI; is this right?
11 comments
Yea, not sure if Azure supports this param yet or not (and if it does, it's probably only in the very latest API version for Azure)
Yes, it's supported in llama-index, I confirmed it
but somehow it's not working with the OpenAI model
seems like the generated schema is incorrect
Certain schemas are not valid in strict mode
That's my understanding anyways, I tried a lot to fix this
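The "certain schemas are not valid" point above has a concrete shape: OpenAI's strict mode requires every object in the schema to set `additionalProperties: false` and to list every property as required. A minimal stdlib-only checker (an assumption: the real API enforces more rules than these two, e.g. on unsupported JSON Schema keywords) makes it easy to see why a typical generated schema with optional fields gets rejected:

```python
def strict_problems(schema, path="$"):
    """Collect violations of two well-known strict-mode rules for each object:
    1. "additionalProperties" must be explicitly false.
    2. Every key in "properties" must appear in "required"."""
    problems = []
    if schema.get("type") == "object":
        if schema.get("additionalProperties") is not False:
            problems.append(f"{path}: additionalProperties must be false")
        props = schema.get("properties", {})
        optional = sorted(set(props) - set(schema.get("required", [])))
        if optional:
            problems.append(f"{path}: optional properties not allowed: {optional}")
        for name, sub in props.items():
            problems += strict_problems(sub, f"{path}.{name}")
    elif schema.get("type") == "array":
        problems += strict_problems(schema.get("items", {}), f"{path}[]")
    return problems

# A schema with an optional field and an open object: invalid in strict mode.
loose = {
    "type": "object",
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name"],
}

# The same schema tightened for strict mode.
tight = {
    "type": "object",
    "additionalProperties": False,
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name", "age"],
}

print(strict_problems(loose))  # reports both violations
print(strict_problems(tight))  # []
```

Pydantic models with `Optional` fields or defaults tend to serialize into the `loose` shape, which is one way an auto-generated schema ends up invalid for strict mode.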
On a tangential note: there are actually two types of structured data extraction, function calling and response_format, but I suppose in llama-index we are doing structured output extraction (i.e. llm.as_structured()) using only function calling
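The two modes distinguished above carry the same schema in different places in the request. A minimal sketch of both payload shapes, following the OpenAI Chat Completions API (the tool name `extract_person` and the schema itself are illustrative assumptions):

```python
import json

# One strict-compliant schema, shared by both request shapes.
person_schema = {
    "type": "object",
    "additionalProperties": False,
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name", "age"],
}

# 1) Function calling: the schema travels in "tools",
#    with the strict flag set per function definition.
tool_call_payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Alice is 30."}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "extract_person",
            "strict": True,
            "parameters": person_schema,
        },
    }],
}

# 2) response_format: the schema constrains the entire reply body.
response_format_payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Alice is 30."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "person",
            "strict": True,
            "schema": person_schema,
        },
    },
}

print(json.dumps(tool_call_payload, indent=2))
```

Functionally both constrain the model to the same JSON; the difference is whether the structured data arrives as a tool-call argument or as the message content itself.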
I would also like to know more about this, as I am planning to use entity extraction at a larger scale, with around 5 to 60 entities being extracted, so I would need some guarantee of the output format.
Function calling is functionally the same as response_format; it's just a different UX
I'm afraid I don't know much more haha

Set strict=True and give it a shot