Saving

Is there any way to save a llama_index.indices.vector_store.base.VectorStoreIndex?
I can't seem to find any method for it... I could always try pickle or joblib... but I'm not sure that will work out.
For local saving, you normally persist the entire storage context and then load it back:

Plain Text
# Save: persist the whole storage context (docstore, index store, vector store) to disk
index.storage_context.persist(persist_dir="./storage")

# Load: rebuild the storage context from that directory and reload the index
from llama_index import StorageContext, load_index_from_storage

storage_context = StorageContext.from_defaults(persist_dir="./storage")
loaded_index = load_index_from_storage(storage_context)
Thanks @Logan M, I think "persist_dir" was the part I was missing.
Sorry @Logan M to come back to this question... but will this store my embeddings and all the other info, so that I don't need to regenerate them?
It sure will πŸ‘
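To make that concrete, here is a minimal build-once/reload-later sketch. It assumes your documents live in a ./data folder and that your LLM/embedding API keys are already configured; those details, and the example query, are illustrative rather than from this thread.

Plain Text
import os

from llama_index import (
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)

PERSIST_DIR = "./storage"

if not os.path.exists(PERSIST_DIR):
    # First run: embed the documents and persist everything to disk
    documents = SimpleDirectoryReader("./data").load_data()  # assumed data folder
    index = VectorStoreIndex.from_documents(documents)
    index.storage_context.persist(persist_dir=PERSIST_DIR)
else:
    # Later runs: reload the stored index; embeddings are read from disk,
    # so nothing is re-embedded
    storage_context = StorageContext.from_defaults(persist_dir=PERSIST_DIR)
    index = load_index_from_storage(storage_context)

query_engine = index.as_query_engine()
print(query_engine.query("What is in my documents?"))  # example query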
Thanks once again 🙂 you've been a great help.