Updated 5 months ago

Python client for azure openai multimodal model gpt-4o-mini parsing

At a glance

The community member is trying to use the LlamaParse Python client with their Azure OpenAI multimodal model named "gpt-4o-mini". However, other community members report that Azure is not currently supported through the LlamaParse API, and the only workaround is to use the UI playground. The community members are still investigating and checking with the team.

In LlamaParse, how do I use the Python client with my Azure OpenAI multimodal model gpt-4o-mini for parsing:
```
from llama_parse import LlamaParse

parser = LlamaParse(
    result_type="markdown",
    use_vendor_multimodal_model=True,
    vendor_multimodal_model_name="",
    vendor_multimodal_api_key="",
)
```
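For non-Azure OpenAI endpoints, the vendor multimodal parameters are typically filled in with a model identifier and API key. A minimal sketch of assembling those keyword arguments is below; the model name "openai-gpt-4o-mini" and the `OPENAI_API_KEY` environment variable are assumptions, not confirmed in this thread:

```python
import os

# Hypothetical helper: builds the keyword arguments passed to LlamaParse(...).
# The parameter names mirror the snippet above; the model identifier is an
# assumption based on common LlamaParse vendor model names.
def build_parser_kwargs(model_name: str, api_key: str) -> dict:
    return {
        "result_type": "markdown",
        "use_vendor_multimodal_model": True,
        "vendor_multimodal_model_name": model_name,
        "vendor_multimodal_api_key": api_key,
    }

kwargs = build_parser_kwargs(
    "openai-gpt-4o-mini",
    os.environ.get("OPENAI_API_KEY", ""),
)
# kwargs can then be expanded into the constructor: LlamaParse(**kwargs)
```

Note that, per the replies below, pointing these parameters at an Azure OpenAI deployment does not work through the API; Azure requires extra configuration (endpoint, deployment name, API version) that the client does not currently accept.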
4 comments
@Logan M 🙏
I don't think azure is actually supported over the api like that (quite a few variables that end up needing config, if we did add support for it)
ah dang, so the only workaround is to use the UI playground here: (it works btw with my azure gpt-4o-mini endpoint)
Attachment: image.png
I'm double checking with the team on that