The post asks when the llamaparse library will be available for JavaScript/TypeScript. Community members discuss using an API to make network requests and send parsing instructions through the API. They mention that the documentation is limited, but a community member provides an example of how to include parsing instructions in API requests. The community members also discuss the availability of parsed data on the llamacloud server, with one member stating that the data is available for at least 2 days.
I could not find docs related to parsing documents with instructions. Is there a way to parse documents with a custom instruction through raw API requests?
I also noticed that even the default create-llama setup with the Python FastAPI backend and llama-parse enabled hits the API for parsing. Since a custom instruction can be set there, I am very eager to know whether I can do the same using raw requests on my own.
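For anyone else looking for this, here is a rough sketch of what the raw request can look like. The upload endpoint path, the `parsing_instruction` form field, and the `id` field in the response are assumptions based on the public LlamaCloud parsing API, so double-check them against the current API reference for your account.

```python
# Minimal sketch: upload a document with a custom parsing instruction
# via a raw HTTP request (endpoint and field names are assumptions).
import os
import requests

API_BASE = "https://api.cloud.llamaindex.ai/api/parsing"  # assumed base URL
headers = {"Authorization": f"Bearer {os.environ['LLAMA_CLOUD_API_KEY']}"}

with open("report.pdf", "rb") as f:
    resp = requests.post(
        f"{API_BASE}/upload",
        headers=headers,
        files={"file": ("report.pdf", f, "application/pdf")},
        data={
            # Assumed form field carrying the custom instruction.
            "parsing_instruction": "Extract every table as markdown and keep section headings.",
        },
    )
resp.raise_for_status()
job_id = resp.json()["id"]  # assumed response field
print("parse job submitted:", job_id)
```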
Thank you very much for this. Now I can implement it in any language. The problem I was facing was sending the parsing instruction through the API, which was not mentioned in the docs, but with this example it is now very clear.
When I post files to LlamaCloud for parsing, I have to check the job status with one API request and then make another API request to fetch the parsed data. How long will the parsed data remain available on the LlamaCloud server for us to fetch?
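As a reference for that two-step flow, here is a rough sketch of polling the job status and then fetching the result. The `job/{job_id}` and `result/markdown` paths and the `status`/`markdown` response fields are assumptions about the LlamaCloud parsing API, so verify them against the current docs.

```python
# Minimal sketch: poll the parse job until it finishes, then fetch the result
# (endpoint paths and response fields are assumptions).
import os
import time
import requests

API_BASE = "https://api.cloud.llamaindex.ai/api/parsing"  # assumed base URL
headers = {"Authorization": f"Bearer {os.environ['LLAMA_CLOUD_API_KEY']}"}

def fetch_parsed_markdown(job_id: str, poll_seconds: float = 2.0) -> str:
    """Wait for the parse job to complete, then return the markdown result."""
    while True:
        status = requests.get(f"{API_BASE}/job/{job_id}", headers=headers)
        status.raise_for_status()
        state = status.json().get("status", "")  # assumed field name
        if state.upper() == "SUCCESS":
            break
        if state.upper() in {"ERROR", "FAILED"}:
            raise RuntimeError(f"parse job {job_id} failed with status {state}")
        time.sleep(poll_seconds)

    result = requests.get(f"{API_BASE}/job/{job_id}/result/markdown", headers=headers)
    result.raise_for_status()
    return result.json()["markdown"]  # assumed response field
```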