Hi everyone,

I'm trying to implement something similar to this example: https://docs.llamaindex.ai/en/stable/examples/query_engine/multi_doc_auto_retrieval/multi_doc_auto_retrieval/

Where I would retrieve an IndexNode from one index and then, because it's an IndexNode, recursively retrieve the referenced document using the retriever stored in its obj.

However, LlamaIndex doesn't persist the obj for later use; it's saved as null. Is there a way to store the obj, or is there a way around this?
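The pattern being described can be sketched without llama_index at all. The classes below (a simplified IndexNode and a SubRetriever) are hypothetical stand-ins, not the real library API; they only illustrate the idea of a top-level node carrying a sub-retriever in its obj that gets followed recursively:

```python
# Pure-Python sketch of the recursive-retrieval pattern described above.
# IndexNode and SubRetriever are simplified stand-ins, NOT the real
# llama_index classes.

class IndexNode:
    def __init__(self, index_id, text, obj=None):
        self.index_id = index_id
        self.text = text
        self.obj = obj  # e.g. a retriever over the referenced document


class SubRetriever:
    """Stand-in for a retriever filtered to one document in a vector store."""
    def __init__(self, docs):
        self.docs = docs

    def retrieve(self, query):
        # Toy substring match in place of a real vector search
        return [d for d in self.docs if query.lower() in d.lower()]


def recursive_retrieve(top_nodes, query):
    """If a retrieved node carries an obj, recurse into that sub-retriever."""
    results = []
    for node in top_nodes:
        if node.obj is not None:
            results.extend(node.obj.retrieve(query))
        else:
            results.append(node.text)
    return results


sub = SubRetriever(["Llamas are camelids.", "Alpacas are smaller."])
top = [IndexNode("doc-1", "Summary of doc-1", obj=sub)]
print(recursive_retrieve(top, "llamas"))  # → ['Llamas are camelids.']
```

The live sub-retriever in obj is exactly the part that gets lost on persist, which is the problem raised below.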
What is the obj pointing to? If it's serializable, it will get saved; otherwise you have to provide it again at load time
The obj is a retriever pointing to a vector database
yea, that's not going to be serializable 😅

You can do this when re-connecting to your vector store

Python
from llama_index.core import VectorStoreIndex
from llama_index.core.schema import IndexNode

# Re-create the IndexNode with the live retriever attached
vector_obj = IndexNode(
    index_id="vector", obj=vector_retriever, text="Vector Retriever"
)

# Pass the objects back in when reconnecting to the vector store
index = VectorStoreIndex.from_vector_store(vector_store, objects=[vector_obj], ...)
I'm pretty much doing it the same way as the example I sent: the summary vector store holds IndexNodes where the obj is a retriever with a filter to another vector store.

I'm assuming this wouldn't work when reconnecting?
When reconnecting you have to pass in the objects again, like the above
since the index node objects aren't serializable
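To see why the obj field ends up as null, here is a minimal sketch (using plain json and a hypothetical FakeRetriever class, not llama_index's actual persistence code) of what happens when a node holding a live retriever is serialized:

```python
import json

# Hypothetical stand-in for a retriever holding a live client connection.
class FakeRetriever:
    def __init__(self):
        self.client = object()  # live handle; not JSON-serializable

node_payload = {"index_id": "vector", "text": "Vector Retriever"}

# The plain fields serialize fine...
serialized = json.dumps(node_payload)

# ...but the attached obj cannot be serialized, so persistence has to
# drop it (hence it comes back as null) and it must be re-attached at load time.
try:
    json.dumps({**node_payload, "obj": FakeRetriever()})
    obj_serializable = True
except TypeError:
    obj_serializable = False

print(obj_serializable)  # → False
```

This is why the objects list has to be passed again on every reconnect: the plain node fields round-trip through storage, but the live retriever never does.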
Gotcha, thank you!