The post asks whether strict mode is supported in AzureOpenAI, as it is in OpenAI. The comments suggest that strict mode may not be supported in the latest Azure API version, but is supported in llama-index. However, community members are having trouble getting it to work with the OpenAI model, as the generated schema appears to be invalid for strict mode. They discuss the two types of structured data extraction, function calling and response_format, and the need for a guaranteed output format when extracting a large number of entities. There is no explicitly marked answer.
On a tangential note, there are actually two types of structured data extraction: function calling and response_format. However, I suppose that in llama-index we are doing structured output extraction, i.e. llm.as_structured(), using only function calling.
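Since the thread mentions the generated schema being invalid for strict mode, here is a minimal sketch of what strict mode requires of a JSON schema: every property listed as required and `additionalProperties` set to `false`. This assumes pydantic v2 (which llama-index uses for output classes); the `to_strict_schema` helper and the `Person` model are hypothetical names for illustration.

```python
from pydantic import BaseModel


class Person(BaseModel):
    # Hypothetical output class for entity extraction
    name: str
    age: int


def to_strict_schema(model_cls: type[BaseModel]) -> dict:
    """Adjust a pydantic-generated JSON schema toward the shape strict
    mode expects: all properties required, no additional properties.
    This is a sketch, not an exhaustive strict-mode validator."""
    schema = model_cls.model_json_schema()
    schema["additionalProperties"] = False
    # Strict mode requires every property to appear in "required".
    schema["required"] = list(schema.get("properties", {}))
    return schema


strict_schema = to_strict_schema(Person)
```

A schema that omits either of these adjustments is one common reason strict mode rejects an otherwise valid pydantic schema.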
I would also like to know more about this, as I am planning to use entity extraction at a larger scale, with around 5 to 60 entities being extracted, so I would need some guarantee of the output format.
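For a guaranteed output format via response_format rather than function calling, the request payload would look roughly like the sketch below. This assumes the Chat Completions `json_schema` response format shape; the schema name `entities` and its fields are placeholders, not from the thread.

```python
# Sketch of a strict response_format payload (assumed Chat Completions shape).
# With "strict": True, the model's output is constrained to match the schema,
# which is the guarantee needed when extracting many entities.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "entities",  # placeholder name
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {
                "entities": {
                    "type": "array",
                    "items": {"type": "string"},
                },
            },
            # Strict mode: all properties required, no extras allowed.
            "required": ["entities"],
            "additionalProperties": False,
        },
    },
}
```

The payload would be passed as the `response_format` argument of a chat completion request; whether strict mode is honored on a given Azure API version is exactly what the thread leaves unresolved.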