

Python client for Azure OpenAI multimodal model gpt-4o-mini parsing

In LlamaParse, how do I use the Python client with my Azure OpenAI multimodal model (gpt-4o-mini) for parsing?
```
from llama_parse import LlamaParse

# vendor model name and API key left blank here
parser = LlamaParse(
    result_type="markdown",
    use_vendor_multimodal_model=True,
    vendor_multimodal_model_name="",
    vendor_multimodal_api_key="",
)
```
4 comments
@Logan M 🙏
I don't think Azure is actually supported over the API like that (there are quite a few variables that would end up needing config if we did add support for it).
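For context on "quite a few variables": talking to an Azure OpenAI deployment generally needs an endpoint, an API version, and a deployment name in addition to the key, which is why it isn't a drop-in swap for the two LlamaParse parameters above. A minimal sketch using the openai Python package directly (all values below are placeholders, and these are not LlamaParse options):
```
from openai import AzureOpenAI  # openai >= 1.x

# Azure needs an endpoint, an API version, and a *deployment* name
# in addition to the key -- placeholder values below.
client = AzureOpenAI(
    api_key="<your-azure-openai-key>",
    api_version="2024-06-01",
    azure_endpoint="https://<your-resource>.openai.azure.com",
)

response = client.chat.completions.create(
    model="<your-gpt-4o-mini-deployment-name>",  # the deployment, not the model family
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```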
Ah dang, so the only workaround is to use the UI playground here (it works, btw, with my Azure gpt-4o-mini endpoint):
[Attachment: image.png]
I'm double checking with the team on that