
Hi all, I'm trying to follow along with some of the examples. In the Local Cache Management section, regarding saving: https://docs.llamaindex.ai/en/stable/module_guides/loading/ingestion_pipeline/root.html#local-cache-management

I am getting an error when I try to load it back in:

new_pipeline.load("./pipeline_storage")

FileNotFoundError: [Errno 2] No such file or directory: '/Users/toast/Developer/LLM/guidelines/pipeline_storage/docstore.json'

Now, this is confusing me on a couple of fronts. In the earlier lines:

Plain Text
# save
pipeline.persist("./pipeline_storage")

# load and restore state
new_pipeline = IngestionPipeline(transformations=transformations)
new_pipeline.load("./pipeline_storage")


it has saved the file, but as ./pipeline_storage/llama_cache,

so why is it now looking for docstore.json?
3 comments
There's a cache and a docstore. Seems like you didn't attach a docstore, but it's still looking for it. Seems like a bug.
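The mismatch described in that comment can be sketched with a stdlib-only stand-in. Note these `persist`/`load` functions are hypothetical illustrations of the behaviour reported in the thread, not the actual llama_index implementation: persisting only writes files for the components that are attached, while loading unconditionally expects docstore.json.

```python
import json
import os
import tempfile

def persist(storage_dir, docstore=None):
    """Write out state; the docstore file only exists if one was attached."""
    os.makedirs(storage_dir, exist_ok=True)
    # The cache is always written...
    with open(os.path.join(storage_dir, "llama_cache"), "w") as f:
        json.dump({"cache": {}}, f)
    # ...but docstore.json is only written when a docstore is attached.
    if docstore is not None:
        with open(os.path.join(storage_dir, "docstore.json"), "w") as f:
            json.dump(docstore, f)

def load(storage_dir):
    # Loading expects docstore.json regardless -> FileNotFoundError
    # when the pipeline was persisted without a docstore.
    with open(os.path.join(storage_dir, "docstore.json")) as f:
        return json.load(f)

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        persist(d)  # persisted without a docstore
        try:
            load(d)
        except FileNotFoundError as e:
            print("load failed:", e)
        persist(d, docstore={})  # workaround: attach a docstore
        print("load succeeded:", load(d))
```

Running this reproduces the shape of the error: the first `load` raises `FileNotFoundError` for docstore.json, and attaching a docstore before persisting makes it succeed.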
Ah, thanks for the reply. I actually had a little chat with kapa.ai about it over here: https://discord.com/channels/1059199217496772688/1075541063072239676/1199677830955540511 and I thought we were making great headway around me creating the docstore.json, but I think it got a wee bit confused, then it tried to get me to use MongoDB πŸ™‚
As a workaround, attach a docstore to the pipeline:

Plain Text
from llama_index.storage.docstore import SimpleDocumentStore

pipeline = IngestionPipeline(..., docstore=SimpleDocumentStore())
....