Faiss index demo

At a glance

The community member is having an issue with the load_index_from_storage method from the LlamaIndex documentation. When using this method, they get a BaseIndex object instead of a VectorStoreIndex, which contains the embeddings. This is causing issues with their code, as they get an API connection error when trying to use the object.

The community members discuss possible solutions, such as casting the BaseIndex to a VectorStoreIndex, but this still produces the API connection error. They also provide code examples of how they are creating and loading the index.

One community member suggests that the load_index_from_storage method is returning a BaseIndex because it can load any index type, and the typing is done this way in Python. They also suggest that the API connection error might be due to a missing embedding model or API key, rather than the typing issue.

In the end, the community member resolves the issue by adding the missing line vector_store=FaissVectorStore.from_persist_dir(self.persist_dir) to their code.

Hey, having an issue with this documentation:
https://docs.llamaindex.ai/en/v0.10.34/examples/vector_stores/FaissIndexDemo/

When using the load_index_from_storage method I get a BaseIndex object instead of a VectorStoreIndex, which contains the embeddings etc., so it does not seem to work properly. Any suggestions? Besides saving to a pickle file.
It's definitely returning a VectorStoreIndex, but in your IDE it's probably typed as a BaseIndex
All the methods are there
You can always cast it if needed to help with your IDE

Plain Text
index: VectorStoreIndex = load_index_from_storage(...)


Or even
Plain Text
index = load_index_from_storage(...)
if not isinstance(index, VectorStoreIndex):
  raise ValueError("unexpected index type loaded")
wow cool thanks for replying.
But the function itself is returning a BaseIndex object:

Plain Text
def load_index_from_storage(
    storage_context: StorageContext,
    index_id: Optional[str] = None,
    **kwargs: Any,
) -> BaseIndex:

And the problem I'm getting with using the object is that I then get an API connection error from the embedding model, which is not happening when I simply create the index directly.
And casting like so is what's causing the API connection error; I assume it is because the BaseIndex does not have an embedding property.

return cast(VectorStoreIndex, load_index_from_storage(storage_context=storage_context))
My code for reference:

Plain Text
def create_index(self) -> VectorStoreIndex:
    vector_store = self._get_vector_store()
    storage_context = StorageContext.from_defaults(vector_store=vector_store)
    documents = self.reader.load_data()
    logging.info("Loaded %d documents into vector store", len(documents))
    vecstore_index = VectorStoreIndex.from_documents(documents, storage_context, embed_model=self.embed_model)
    if self.persist_dir:
        storage_context.persist(self.persist_dir)
    return vecstore_index

def load_index(self) -> VectorStoreIndex:
    if not Path(self.persist_dir).exists():
        raise FileNotFoundError(f"Persist dir [{self.persist_dir}] does not exist")
    logging.info("Loading index from dir: %s", self.persist_dir)
    storage_context = StorageContext.from_defaults(persist_dir=self.persist_dir, vector_store=self._get_vector_store())
    return cast(VectorStoreIndex, load_index_from_storage(storage_context=storage_context))
BaseIndex is the base index class. Since load_index_from_storage can load any index type, it has to be typed this way. This is just how typing in Python works imo

I don't think the API connection error is caused by the typing here (typing in Python is not strict anyway)

More likely, you have a custom embedding model and you are not passing it in? Or you don't have your API key set for OpenAI?

load_index_from_storage(storage_context, embed_model=embed_model)
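For example (a minimal sketch, assuming a local HuggingFace embedding model and a hypothetical ./storage persist dir; swap in whatever embed model the index was actually built with):

Plain Text
from llama_index.core import StorageContext, load_index_from_storage
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# The same embedding model that was used when the index was built
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

storage_context = StorageContext.from_defaults(persist_dir="./storage")
# Passing the embed model explicitly avoids falling back to the default (OpenAI) embeddings
index = load_index_from_storage(storage_context, embed_model=embed_model)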
Hey, sorry for the delay, the embedding works when creating the index, the only problem I have is with the load method.
I added the embedding model as you showed, but it does not behave in the same way.
I was also missing this line: vector_store=FaissVectorStore.from_persist_dir(self.persist_dir)
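Putting it together, the load side ends up looking roughly like this (a sketch, assuming the same persist_dir and embed_model attributes as in my code above):

Plain Text
# Assumed imports (as elsewhere in this class):
#   from pathlib import Path
#   from llama_index.core import StorageContext, VectorStoreIndex, load_index_from_storage
#   from llama_index.vector_stores.faiss import FaissVectorStore

def load_index(self) -> VectorStoreIndex:
    if not Path(self.persist_dir).exists():
        raise FileNotFoundError(f"Persist dir [{self.persist_dir}] does not exist")
    # Rebuild the Faiss vector store from the persisted files instead of
    # creating a fresh, empty one via self._get_vector_store()
    vector_store = FaissVectorStore.from_persist_dir(self.persist_dir)
    storage_context = StorageContext.from_defaults(
        vector_store=vector_store, persist_dir=self.persist_dir
    )
    index = load_index_from_storage(storage_context, embed_model=self.embed_model)
    assert isinstance(index, VectorStoreIndex)  # narrows the BaseIndex typing for the IDE
    return index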
Thank you, just a misunderstanding on my part