toasty_mctoastface
Joined September 25, 2024
Hi all, I'm trying to follow along with some of the examples, specifically the saving part of the Local Cache Management section: https://docs.llamaindex.ai/en/stable/module_guides/loading/ingestion_pipeline/root.html#local-cache-management

I am getting an error when I try to load it back in:

new_pipeline.load("./pipeline_storage")

FileNotFoundError: [Errno 2] No such file or directory: '/Users/toast/Developer/LLM/guidelines/pipeline_storage/docstore.json'

Now, this is confusing me on a couple of fronts. In the earlier lines:

Plain Text
# save
pipeline.persist("./pipeline_storage")

# load and restore state
new_pipeline = IngestionPipeline(transformations=transformations)
new_pipeline.load("./pipeline_storage")
It has saved the file, but as ./pipeline_storage/llama_cache.

So why is it now looking for docstore.json?
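Just to sanity-check my reading of the traceback, here's a stdlib-only sketch (not the actual llama-index code, just my guess at the check load() is doing) that reproduces the same FileNotFoundError when only llama_cache is on disk:

```python
import json
import tempfile
from pathlib import Path

# Simulate what persist apparently wrote for me: only the cache file,
# no docstore.json next to it.
storage = Path(tempfile.mkdtemp()) / "pipeline_storage"
storage.mkdir(parents=True)
(storage / "llama_cache").write_text(json.dumps({}))

# My guess: load() unconditionally opens docstore.json alongside the cache,
# which is exactly the file open() can't find in my traceback.
error = None
try:
    with open(storage / "docstore.json") as f:
        json.load(f)
except FileNotFoundError as e:
    error = e

print("reproduced:", error.strerror)  # → reproduced: No such file or directory
```

Happy to be told the docs expect me to have set up a docstore first and I just missed that step.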