docstore.json, vector_store.json, etc.) and somehow get a list of all the nodes and their embeddings (call it all_nodes) so that all_nodes can be added to another type of index? Like this: vector_store.add(all_nodes_from_json), where vector_store is an OpenSearchVectorStore?

# load the index
index = load_index_from_storage(...)

# get the nodes and embeddings
nodes = index.docstore.docs
embeddings = index.vector_store._data.embedding_dict

# attach the embeddings
nodes_with_embeddings = []
for node_id, node in nodes.items():
    node.embedding = embeddings[node_id]
    nodes_with_embeddings.append(node)

# create a new index with the new backend
vector_index = VectorStoreIndex(nodes_with_embeddings, storage_context=storage_context)