Hi all, I'm trying to follow along with some of the examples. In the Local Cache Management section, regarding saving:
https://docs.llamaindex.ai/en/stable/module_guides/loading/ingestion_pipeline/root.html#local-cache-management
I am getting an error when I try to load it back in:
new_pipeline.load("./pipeline_storage")
FileNotFoundError: [Errno 2] No such file or directory: '/Users/toast/Developer/LLM/guidelines/pipeline_storage/docstore.json'
Now, this is confusing me on a couple of fronts. In the earlier lines:
# save
pipeline.persist("./pipeline_storage")
# load and restore state
new_pipeline = IngestionPipeline(transformations=transformations)
it has saved the file as
./pipeline_storage/llama_cache
so why is load now looking for
docstore.json
?
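To make the mismatch I'm seeing concrete, here is a plain-Python sketch of what I think is happening on disk. Everything here (persist, load, the file-name constants) is a hypothetical stand-in for my mental model, not actual LlamaIndex code:

```python
import json
from pathlib import Path

CACHE_FILE = "llama_cache"       # what persist() actually wrote for me
DOCSTORE_FILE = "docstore.json"  # what load() seems to expect

def persist(persist_dir: str, cache: dict) -> None:
    """Write only the cache file, mirroring what I see on disk after persist()."""
    path = Path(persist_dir)
    path.mkdir(parents=True, exist_ok=True)
    (path / CACHE_FILE).write_text(json.dumps(cache))

def load(persist_dir: str) -> dict:
    """My mental model of load(): it unconditionally reads the docstore,
    which raises FileNotFoundError when only the cache was persisted."""
    path = Path(persist_dir)
    docstore = json.loads((path / DOCSTORE_FILE).read_text())  # <- fails here
    cache = json.loads((path / CACHE_FILE).read_text())
    return {"docstore": docstore, "cache": cache}

persist("./pipeline_storage", {"nodes": []})
try:
    load("./pipeline_storage")
except FileNotFoundError as e:
    print(e)  # a FileNotFoundError for docstore.json, matching the error above
```

So my question boils down to: under what conditions does persist() write docstore.json at all, and is load() supposed to skip it when it's absent?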