Thanks @WhiteFang_Jr for the reply, but I'm a little confused.
So here is my situation:
I have, let's say, 5,000 nodes to insert, and I want to batch them for faster processing (sometimes I'll be inserting 20,000 nodes at a time).
So I am processing 1,000 nodes per batch, with 5 batches running in parallel.
In each batch, I have the following:
pinecone_index = get_pinecone_index()
vector_store = PineconeVectorStore(
    pinecone_index=pinecone_index,
    namespace=f"product/{product_id}",
)
print("Adding nodes to pinecone")
vector_store.add(nodes)
The above works because Pinecone supports concurrent writes.
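Concretely, the batching and fan-out I described look roughly like this. The node list and `add_batch` helper are toy stand-ins; in the real pipeline `add_batch` would build the `PineconeVectorStore` and call `vector_store.add(batch)`:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for the real list of TextNode objects.
nodes = list(range(5000))
BATCH_SIZE = 1000
MAX_PARALLEL = 5

def chunk(items, size):
    """Split items into consecutive batches of at most `size`."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def add_batch(batch):
    # In the real pipeline this body would be:
    #   vector_store = PineconeVectorStore(
    #       pinecone_index=get_pinecone_index(),
    #       namespace=f"product/{product_id}",
    #   )
    #   vector_store.add(batch)
    return len(batch)

batches = chunk(nodes, BATCH_SIZE)
with ThreadPoolExecutor(max_workers=MAX_PARALLEL) as ex:
    counts = list(ex.map(add_batch, batches))

print(len(batches), sum(counts))  # → 5 5000
```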
But I am not sure what to do with the summary index. Can I update the summary index in parallel as well?
I am using Redis to persist the summary index and Pinecone as the vector store.
Also, please note that I am using nodes, not documents.
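To frame the question: one option I'm considering is keeping the Pinecone writes parallel but serializing the summary-index writes behind a lock, since I'm not sure concurrent writers to a single Redis-persisted index are safe. A minimal sketch of that idea, with `summary_store` as a stand-in for the real `summary_index.insert_nodes` call:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

nodes = list(range(100))
batches = [nodes[i:i + 20] for i in range(0, len(nodes), 20)]

summary_lock = threading.Lock()
summary_store = []  # stand-in for the Redis-persisted summary index

def process_batch(batch):
    # 1) Vector-store write: stays parallel, since Pinecone handles
    #    concurrent upserts (vector_store.add(batch) in the real code).
    # 2) Summary-index write: serialized, so concurrent writers can't
    #    clobber the shared index state in Redis.
    with summary_lock:
        summary_store.extend(batch)  # stand-in for summary_index.insert_nodes(batch)

with ThreadPoolExecutor(max_workers=5) as ex:
    list(ex.map(process_batch, batches))

print(len(summary_store))  # → 100
```

Would something like this be the right approach, or is there a supported way to update the summary index concurrently?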