The community member asks whether it is possible to run LlamaParse locally without involving LlamaCloud, and whether a sample notebook is available. The replies indicate that LlamaParse is currently cloud-only, though the company is working on an on-premises deployment option for enterprises. The ETA for that option is several months, and the company is only working with close enterprise partners for now. Community members interested in this feature are advised to reach out via the company's contact form.
Is it possible to run LlamaParse locally without involving LlamaCloud? I see questions on the YouTube tutorial asking whether it can run on-prem, but they went unanswered. If it is possible, is there a sample notebook?
Do you have any ETA? My company has very strict rules on data handling whenever an LLM is involved, so on-prem deployment would be the only possibility for adoption. I suspect similar policies exist for many potential enterprise users.
It's not open-source. The ETA is several months, and we're only working with close enterprise partners for now. I recommend reaching out on the contact form if you're interested: https://www.llamaindex.ai/contact
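In the meantime, LlamaParse can only be used through LlamaCloud. Below is a minimal sketch of that cloud-backed usage, which is why a purely local run isn't currently possible; it assumes the `llama-parse` package is installed, a `LLAMA_CLOUD_API_KEY` is set, and `./my_report.pdf` is a placeholder file path.

```python
# Minimal sketch of the current cloud-backed LlamaParse usage (not a local run).
# Assumes the llama-parse package is installed and LLAMA_CLOUD_API_KEY is set.
import os

from llama_parse import LlamaParse

parser = LlamaParse(
    api_key=os.environ["LLAMA_CLOUD_API_KEY"],  # requests are sent to LlamaCloud
    result_type="markdown",                     # "markdown" or "text"
)

# The document is uploaded to LlamaCloud for parsing; nothing runs on-prem.
documents = parser.load_data("./my_report.pdf")
print(documents[0].text[:500])
```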