
Updated 4 months ago

llama_parse/examples/demo_advanced.ipynb...

At a glance

The community members are discussing how to get llama_parse working with Azure OpenAI. One community member explains that the Azure OpenAI LLM and embedding model can be set in llama-index's global settings, as the demo notebook does, or passed in as kwargs where they are needed, and the original poster confirms that this worked for them.

However, the original poster then hits a problem on their Windows PC: a timeout exception when trying to load a PDF file with llama_parse. It turns out the PDF was likely corrupted because the wget download link pointed to the wrong place, and there had also been a rate-limit issue on the llama-cloud backend, which has since been resolved.

There is no explicitly marked answer in the comments, but the community members provide helpful suggestions and guidance for using llama_parse with Azure OpenAI.

Useful resources
is there a way to get llama_parse working with Azure OpenAI? the demo only has the vanilla version:

https://github.com/run-llama/llama_parse/blob/main/examples/demo_advanced.ipynb
8 comments
you can set it in the global settings like that notebook does, or pass the llm/embed model in as kwargs wherever they are needed
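
For reference, a minimal sketch of the global-settings approach, assuming the llama-index-llms-azure-openai and llama-index-embeddings-azure-openai packages are installed; the endpoint, deployment names, and API version below are placeholders to replace with your own Azure values:

Plain Text
from llama_index.core import Settings
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding

# Global LLM, picked up anywhere llama-index needs one (e.g. query engines in the demo notebook)
Settings.llm = AzureOpenAI(
    model="gpt-4o",
    deployment_name="my-gpt4o-deployment",  # placeholder deployment name
    azure_endpoint="https://my-resource.openai.azure.com/",  # placeholder endpoint
    api_key="...",
    api_version="2024-02-15-preview",
)

# Global embedding model, used when building indexes over the parsed documents
Settings.embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",
    deployment_name="my-embedding-deployment",  # placeholder deployment name
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_key="...",
    api_version="2024-02-15-preview",
)

Alternatively, the same objects can be passed explicitly where they are used, e.g. VectorStoreIndex.from_documents(documents, embed_model=...) or index.as_query_engine(llm=...), instead of setting them globally.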
thanks logan! that worked.
funny enough, on my windows pc, it's failing on this line:

Plain Text
from llama_parse import LlamaParse

documents = LlamaParse(result_type="markdown").load_data('./uber_10q_march_2022.pdf')


getting Exception: timeout....

is something down with llama cloud?
i see now... the pdf was likely corrupted because the address moved
wget points to the wrong place
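
A quick way to confirm that kind of bad download before blaming the service is to check that the file actually starts with the PDF magic bytes. A minimal sketch, using the same path as the snippet above:

Plain Text
# A redirected or moved download often saves an HTML error page instead of a PDF.
# Real PDF files start with the bytes "%PDF".
with open("./uber_10q_march_2022.pdf", "rb") as f:
    header = f.read(5)

if not header.startswith(b"%PDF"):
    print("Not a valid PDF; re-download the file from the current URL before parsing.")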
llama-cloud may have also been down; there was some issue with rate limits on the backend (should be all good now though)
thanks, i'll ask another question in the general chat, maybe it helps someone too