I'm trying to integrate with AstraDB. I have executed generate.py and can observe the embeddings in AstraDB.

When I try to load the StorageContext as an index, it throws an error. Going through the LlamaIndex AstraDB documentation, I didn't find anything about loading an index.
Plain Text
    astra_db_store = AstraDBVectorStore(....)
    storage_context = StorageContext.from_defaults(vector_store=astra_db_store)
    index = load_index_from_storage(storage_context)
    index.as_chat_engine(chat_mode=ChatMode.CONDENSE_PLUS_CONTEXT)
    .....


Plain Text
File "...lib/python3.11/site-packages/llama_index/core/indices/loading.py", line 36, in load_index_from_storage
    raise ValueError(
ValueError: No index in storage context, check if you specified the right persist_dir.


Am I missing something?
8 comments
In the tutorial they use the index returned from VectorStoreIndex.from_documents(). In my case I used load_index_from_storage because I had already prepared the embeddings in the earlier step.
Try doing it this way once:
Plain Text
index = VectorStoreIndex.from_documents([], storage_context=storage_context)
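For context on why the original call fails: load_index_from_storage expects index metadata that was previously persisted into the storage context's docstore/index store (typically via a persist_dir), whereas a StorageContext built only from a vector store carries no such metadata. A minimal sketch of the distinction, with a placeholder persist path:

```python
from llama_index.core import StorageContext, load_index_from_storage

# load_index_from_storage works when index metadata was persisted to disk
# earlier, e.g. with index.storage_context.persist("./storage"):
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)

# A context built only from a vector store (no persist_dir) contains no
# index metadata, which is why load_index_from_storage raises the
# "No index in storage context" ValueError shown above.
```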
Yes, now it works. I have changed my code as below:

Plain Text
    astra_db_store = AstraDBVectorStore(....)
    storage_context = StorageContext.from_defaults(vector_store=astra_db_store)
    index = VectorStoreIndex.from_documents([], storage_context=storage_context)
    index.as_chat_engine(chat_mode=ChatMode.CONDENSE_PLUS_CONTEXT)
I updated my code to use VectorStoreIndex.from_vector_store(), and it still works.
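A sketch of that from_vector_store variant (the AstraDBVectorStore arguments below are placeholders for your own credentials and collection):

```python
from llama_index.core import VectorStoreIndex
from llama_index.vector_stores.astra_db import AstraDBVectorStore

astra_db_store = AstraDBVectorStore(
    token="<astra-token>",            # placeholder credentials
    api_endpoint="<api-endpoint>",
    collection_name="my_collection",
    embedding_dimension=1536,
)

# from_vector_store skips the document-ingestion path entirely and wraps
# the pre-populated store, so no StorageContext is needed at all.
index = VectorStoreIndex.from_vector_store(vector_store=astra_db_store)
chat_engine = index.as_chat_engine(chat_mode="condense_plus_context")
```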
It does make the APIs a little inconsistent, though; generally we were taught to use load_index_from_storage.