Just very general, but what are the actual cons of using LlamaIndex instead of using vector DBs directly, like Qdrant, Milvus or whatever? I had the feeling that it's easier not to use LlamaIndex, given some missing support, for example with filtering...
LlamaIndex is not an alternative to vector DBs.
LlamaIndex allows you to use different vector DBs.
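As a minimal sketch of what that looks like in practice (assuming llama-index 0.10+ with the llama-index-vector-stores-qdrant package, a local Qdrant instance, and the default OpenAI embedding/LLM settings; the collection name and data folder are placeholders):

```python
# Sketch: using Qdrant as the vector store behind a LlamaIndex index.
import qdrant_client
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.qdrant import QdrantVectorStore

client = qdrant_client.QdrantClient(url="http://localhost:6333")
vector_store = QdrantVectorStore(client=client, collection_name="docs")
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Embeddings are written into Qdrant instead of LlamaIndex's default in-memory store.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

print(index.as_query_engine().query("What is this corpus about?"))
```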
Yes, I know, but why would I need that? I don't really see a use case for it.
Vector DBs come into play when you have lots of files (for example, 1000 files).
In that case, loading the index or even persisting it to a local folder takes a lot of time.
Keeping the embeddings for 1000 files in memory is also not a good way to work.
In addition to this, some vector DBs provide different ways of finding related nodes and different filtering methods. If you only have a small number of files, you don't have to use a vector DB.
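To illustrate the point about load time: once the embeddings live in an external vector DB, a restart only needs to reconnect to the existing collection rather than re-load or re-embed anything. A sketch, reusing the hypothetical "docs" collection from above (query-time embedding still uses whatever embedding model is configured):

```python
# Sketch: rebuild the index straight from an existing Qdrant collection.
import qdrant_client
from llama_index.core import VectorStoreIndex
from llama_index.vector_stores.qdrant import QdrantVectorStore

client = qdrant_client.QdrantClient(url="http://localhost:6333")
vector_store = QdrantVectorStore(client=client, collection_name="docs")

# No documents, no persist_dir: the embeddings already live in Qdrant.
index = VectorStoreIndex.from_vector_store(vector_store=vector_store)
retriever = index.as_retriever(similarity_top_k=5)
nodes = retriever.retrieve("deployment checklist")
```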
Especially for a large number of files and embeddings, I would rather use a vector DB directly. I currently have about 150k chunks, and using the filtering and search API directly from the vector DB just gave me way more freedom; I felt quite restricted with LlamaIndex.
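For contrast, this is roughly what the "go direct" route looks like: querying Qdrant's own API with a payload filter and no LlamaIndex in between. The collection name, payload field, and query vector are placeholders you would supply yourself:

```python
# Sketch: direct Qdrant search with a payload filter, skipping LlamaIndex entirely.
from qdrant_client import QdrantClient
from qdrant_client.models import FieldCondition, Filter, MatchValue

client = QdrantClient(url="http://localhost:6333")
hits = client.search(
    collection_name="docs",
    query_vector=[0.1] * 1536,  # embedding of the query, computed separately
    query_filter=Filter(
        must=[FieldCondition(key="source", match=MatchValue(value="handbook"))]
    ),
    limit=5,
)
for hit in hits:
    print(hit.score, hit.payload)
```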
I mean, llama-index provides a unified interface across several integrations (LLMs, embeddings, vector DBs). One advantage is that if you ever switch, the interfaces all still work together.

LlamaIndex provides way more than just database connectors -- LLMs, embeddings, response synthesizers, different ways to ingest data, postprocessing of retrieved results, plugging things into agents, etc. I think that's where a lot of the appeal comes from.
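A small sketch of that point: the same index can be wired into retrieval post-processing and one of the built-in response synthesis modes with a couple of extra arguments. The cutoff value is illustrative, and `index` is assumed to be built as in the earlier snippets:

```python
# Sketch: query engine with a node postprocessor and an explicit response mode.
from llama_index.core.postprocessor import SimilarityPostprocessor

query_engine = index.as_query_engine(
    similarity_top_k=10,
    # Drop weakly related chunks before the LLM sees them.
    node_postprocessors=[SimilarityPostprocessor(similarity_cutoff=0.75)],
    response_mode="compact",  # one of the built-in response synthesis modes
)
print(query_engine.query("Summarize the security guidelines."))
```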

It's an open-source repo, so if something like filtering were missing, contributions are totally welcome πŸ™‚ (although these days, most popular DBs have extensive filtering support).
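On the filtering point specifically, metadata filters are also available through LlamaIndex itself and get passed down to the vector store integration. A sketch with placeholder keys and values, again assuming an existing `index`:

```python
# Sketch: metadata filtering through LlamaIndex's retriever interface.
from llama_index.core.vector_stores import ExactMatchFilter, MetadataFilters

filters = MetadataFilters(filters=[ExactMatchFilter(key="department", value="legal")])
retriever = index.as_retriever(filters=filters, similarity_top_k=5)
for result in retriever.retrieve("data retention policy"):
    print(result.score, result.node.metadata)
```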