
Hey team - Is Llamaindex safe for multiprocessing?

Hey team - Is Llamaindex safe for multiprocessing for both ingestion and querying? I'm interested in using it in a scaled application to replace my custom RAG setup, but haven't been able to find a recent answer to this question online or in the discord history.
6 comments
Eh, it depends a lot on which LLMs, vector stores, etc. you are using

Always safer to use async and rely on concurrency rather than parallelism
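
A minimal sketch of that async/concurrency route, assuming a recent llama-index release where imports live under llama_index.core and where the configured LLM and embedding backends support async calls; the ./data path and the example questions are placeholders.

```python
# One process, one index; concurrency comes from asyncio rather than
# from multiple OS processes.
import asyncio

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex


async def main() -> None:
    documents = SimpleDirectoryReader("./data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    query_engine = index.as_query_engine()

    questions = [
        "What is this corpus about?",
        "Summarize the main findings.",
        "List any open questions.",
    ]
    # aquery() is the async counterpart of query(); gather() runs the
    # requests concurrently on a single event loop.
    answers = await asyncio.gather(*(query_engine.aquery(q) for q in questions))
    for question, answer in zip(questions, answers):
        print(question, "->", answer)


if __name__ == "__main__":
    asyncio.run(main())
```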
Thanks @Logan M - so we don't know of any setup that works for a highly scalable system leveraging both async and multiple workers?
You can have multiple workers if you want, but you just need to be careful about shared resources

Imo each worker would instantiate from scratch the things it needs to use
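
A rough sketch of that "each worker instantiates its own objects" pattern, using a multiprocessing.Pool initializer so every worker process builds its own index and query engine after the fork; the data path, worker count, and questions are placeholders, and the in-memory index is only for illustration.

```python
from multiprocessing import Pool

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

_query_engine = None  # per-process global, set up once in each worker


def init_worker() -> None:
    # Runs once in each worker process: build the index and query engine
    # from scratch instead of inheriting live objects from the parent.
    global _query_engine
    documents = SimpleDirectoryReader("./data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    _query_engine = index.as_query_engine()


def answer(question: str) -> str:
    # Uses only objects created inside this worker process.
    return str(_query_engine.query(question))


if __name__ == "__main__":
    questions = ["What is this corpus about?", "Summarize the main findings."]
    with Pool(processes=2, initializer=init_worker) as pool:
        for result in pool.map(answer, questions):
            print(result)
```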
Thanks @Logan M. Hmm. Putting it another way: is the system designed to work safely and effectively with multiprocessing-safe storage systems? In our use case, we need multiple workers for both ingestion and querying. When ingesting, can multiple workers submit to a shared storage mechanism that's multiprocessing-safe without crashing? And likewise, when querying, can multiple workers query shared, multiprocessing-safe storage mechanisms without crashing?
It depends on what storage/vector store you are using, and whether it handles parallel inserts/queries
That's out of llama-index's hands
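
A sketch of pushing the concurrency question down to the storage layer, assuming the llama-index-vector-stores-qdrant integration and a Qdrant server at localhost:6333 as stand-ins; whether parallel inserts and queries are actually safe is up to the chosen store, as noted above, and the collection name and paths are placeholders.

```python
import qdrant_client

from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.qdrant import QdrantVectorStore


def ingest_worker(path: str) -> None:
    # Each ingestion worker opens its own client; the external store is
    # the one responsible for handling parallel inserts.
    client = qdrant_client.QdrantClient(url="http://localhost:6333")
    vector_store = QdrantVectorStore(client=client, collection_name="docs")
    storage_context = StorageContext.from_defaults(vector_store=vector_store)
    documents = SimpleDirectoryReader(path).load_data()
    VectorStoreIndex.from_documents(documents, storage_context=storage_context)


def query_worker(question: str) -> str:
    # Query workers likewise connect independently and read from the
    # shared collection.
    client = qdrant_client.QdrantClient(url="http://localhost:6333")
    vector_store = QdrantVectorStore(client=client, collection_name="docs")
    index = VectorStoreIndex.from_vector_store(vector_store)
    return str(index.as_query_engine().query(question))


if __name__ == "__main__":
    ingest_worker("./data")
    print(query_worker("What is this corpus about?"))
```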