Is it possible to use `insert_nodes` on a VectorStoreIndex asynchronously?

At a glance

The community member asks whether `insert_nodes` on a VectorStoreIndex can be used asynchronously, since async indexing otherwise appears possible only via `build_index_from_nodes`. Another community member confirms this is not currently possible, but suggests that, when a third-party vector database such as Pinecone is in use, the same result can be achieved asynchronously with an ingestion pipeline: an `IngestionPipeline` from the llama_index library configured with an `OpenAIEmbedding` transformation and the vector store.

Hi, is it possible to use `insert_nodes` on a VectorStoreIndex asynchronously? It seems async is only possible using `build_index_from_nodes`; is that correct?
4 comments
yea good call out, not possible right now

If you are using a 3rd party vector db, you can use an ingestion pipeline to do this async
yeah, I'm using Pinecone, any tip?
I want to keep the nice text nodes abstraction
you can do something like this. If you haven't already chunked your nodes, you can also add a splitter before the embedding model in the pipeline

Python
# Assumes the llama-index >= 0.10 package layout; `vector_store` is an
# already-constructed vector store instance (e.g. a PineconeVectorStore).
from llama_index.core.ingestion import IngestionPipeline
from llama_index.embeddings.openai import OpenAIEmbedding

pipeline = IngestionPipeline(transformations=[OpenAIEmbedding()], vector_store=vector_store)

# Embeds the nodes and upserts them into the vector store asynchronously.
await pipeline.arun(nodes=nodes)
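
For completeness, here is a minimal sketch of the splitter variant mentioned above, wiring the pipeline to a Pinecone vector store. The index name, chunk sizes, and credential handling are illustrative assumptions, not details from the thread.

Python
# Sketch only: assumes llama-index >= 0.10 with the OpenAI embedding and
# Pinecone integration packages installed; names and sizes are placeholders.
from pinecone import Pinecone
from llama_index.core.ingestion import IngestionPipeline
from llama_index.core.node_parser import SentenceSplitter
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.vector_stores.pinecone import PineconeVectorStore

pc = Pinecone(api_key="...")  # hypothetical credentials/index name
vector_store = PineconeVectorStore(pinecone_index=pc.Index("my-index"))

# The splitter runs before the embedding model, so documents are chunked
# into TextNodes first, then embedded and upserted into Pinecone async.
pipeline = IngestionPipeline(
    transformations=[
        SentenceSplitter(chunk_size=512, chunk_overlap=50),
        OpenAIEmbedding(),
    ],
    vector_store=vector_store,
)

# `documents` is a list of already-loaded llama_index Document objects;
# arun returns the embedded TextNodes it inserted.
nodes = await pipeline.arun(documents=documents)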