Creating an index over a huge MongoDB dataset without running out of RAM

Hi all, I have a MongoDB with a huge set of prestored data (~20GB, with text and metadata in each document). I'm using SimpleMongoReader to try to create an index, but when it gets to reader.load_data, I run out of RAM. Is there any workaround so that I don't have to load everything into memory first to create the index?
Hmm, I think the only solution here is to query Mongo in batches so that you aren't pulling in all 20GB of data at once, and insert into the index incrementally. Something like the sketch below.
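
Roughly what I mean, as a minimal sketch. This assumes pymongo plus a recent LlamaIndex (v0.10+ import paths; older versions use `from llama_index import ...`), and the connection URI, db/collection names, and the `"text"` field are placeholders for your setup:

```python
from pymongo import MongoClient
from llama_index.core import Document, VectorStoreIndex

client = MongoClient("mongodb://localhost:27017")
collection = client["my_db"]["my_collection"]

# Start from an empty index and insert in small batches, so peak RAM is
# roughly one batch of documents instead of the whole 20GB.
index = VectorStoreIndex.from_documents([])

batch = []
for raw in collection.find({}, batch_size=1000):  # the cursor streams lazily
    batch.append(Document(text=raw["text"], metadata={"_id": str(raw["_id"])}))
    if len(batch) >= 1000:
        for d in batch:
            index.insert(d)
        batch.clear()

for d in batch:  # flush any remainder
    index.insert(d)
```

Embedding still costs compute per document, but memory stays bounded to one batch at a time.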

That said, creating an index over 20GB of data will still use a LOT of compute/memory. I would definitely use a production vector DB for this (Qdrant, Weaviate, Pinecone).
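
For example, here's a minimal sketch of backing the index with Qdrant so the vectors live in the external store instead of in process memory. It assumes `pip install llama-index-vector-stores-qdrant qdrant-client`, and the URL and collection name are placeholders:

```python
import qdrant_client
from llama_index.core import StorageContext, VectorStoreIndex
from llama_index.vector_stores.qdrant import QdrantVectorStore

qclient = qdrant_client.QdrantClient(url="http://localhost:6333")
vector_store = QdrantVectorStore(client=qclient, collection_name="mongo_docs")
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Combine this with the batched inserts above: build the (empty) index
# against the external store, then stream documents in with index.insert(...).
index = VectorStoreIndex.from_documents([], storage_context=storage_context)
```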