Is it possible to run LlamaParse locally without involving LlamaCloud? I see questions on the YouTube tutorial asking whether it can run on-prem, but they weren't answered. If it is possible, is there a sample notebook?
Do you have an ETA? My company has very strict rules on data whenever LLMs are involved, so on-prem would be the only path to adoption. I think similar policies exist for many potential enterprise users.
It's not open source, and the ETA for on-prem is several months; we're only working with close enterprise partners for now. If you're interested, I recommend reaching out via the contact form: https://www.llamaindex.ai/contact
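For context, here is a minimal sketch of how LlamaParse is typically used today, assuming the `llama-parse` Python package; the parameter names are illustrative and the key value is a placeholder. The point is that the client uploads documents to LlamaCloud and requires an API key, which is why a purely local run isn't currently an option.

```python
# pip install llama-parse  (assumed package name)
from llama_parse import LlamaParse

# LlamaParse sends the document to LlamaCloud for parsing,
# so a LlamaCloud API key is required -- there is no local backend today.
parser = LlamaParse(
    api_key="llx-...",        # placeholder LlamaCloud API key
    result_type="markdown",   # parsed output format ("markdown" or "text")
)

# The file is uploaded to LlamaCloud and parsed results are returned as documents.
documents = parser.load_data("./my_report.pdf")
print(documents[0].text[:500])
```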