I'm watching the Discover LlamaIndex series on YouTube and I have some questions regarding the Data Agent.
https://youtu.be/GkIEEdIErm8?feature=shared&t=269

At 4:29, the presenter says Wikipedia data is queried and dumped into a vector store, and then the second tool queries that vector store.
- From the screenshot, I can't tell where the vector store is specified. I would have expected something like Pinecone to be configured somewhere. Is this just saving a vector index in memory (no vector DB), while still allowing follow-up questions?
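(For context on what I mean by "in memory": my understanding is that when no external vector store is configured, the embeddings just live in plain Python data structures for the session and nothing is persisted. A toy sketch of that idea, not the actual LlamaIndex internals, with a made-up 2-d embedding:)

```python
import math

class InMemoryVectorStore:
    """Toy stand-in for an in-memory vector index: embeddings live in a
    plain dict, nothing is written to disk or to an external vector DB,
    so the index disappears when the process ends."""

    def __init__(self):
        self._nodes = {}  # node_id -> (embedding, text)

    def add(self, node_id, embedding, text):
        self._nodes[node_id] = (embedding, text)

    def query(self, embedding, top_k=1):
        # Rank stored chunks by cosine similarity to the query embedding.
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(y * y for y in b))
            return dot / (na * nb)

        scored = sorted(
            self._nodes.values(),
            key=lambda pair: cosine(embedding, pair[0]),
            reverse=True,
        )
        return [text for _, text in scored[:top_k]]

store = InMemoryVectorStore()
store.add("n1", [1.0, 0.0], "Wikipedia chunk about Berlin")
store.add("n2", [0.0, 1.0], "Wikipedia chunk about Tokyo")
print(store.query([0.9, 0.1], top_k=1))  # -> ['Wikipedia chunk about Berlin']
```

Since the store survives between queries within one session, follow-up questions can hit the same index without re-fetching Wikipedia.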
- On each API call, you have to embed whatever you fetched from Wikipedia and then dump it into the vector store. Is this kind of practice normal? It sounds really slow/heavy, but I can see a use case for it: for example, narrowing a diary down to a specific date range with SQL and then doing a refined vector-based search within that subset.
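(To make the diary use case concrete, here is a minimal self-contained sketch of the pattern I have in mind. The `embed` function is a hypothetical stand-in for a real embedding model call, which would be the expensive per-query step:)

```python
import math
import sqlite3

def embed(text):
    # Hypothetical embedding: a 2-d keyword-count vector standing in for
    # a real embedding model API call (the slow/heavy part per query).
    words = text.lower().split()
    return [float(words.count("rain")), float(words.count("beach"))]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE diary (day TEXT, entry TEXT)")
conn.executemany(
    "INSERT INTO diary VALUES (?, ?)",
    [
        ("2023-06-01", "rain all day stayed inside"),
        ("2023-06-15", "sunny beach trip with friends"),
        ("2023-08-20", "another beach day"),  # outside the date range below
    ],
)

# Step 1: cheap structured narrowing with SQL (date range).
rows = conn.execute(
    "SELECT day, entry FROM diary WHERE day BETWEEN ? AND ?",
    ("2023-06-01", "2023-06-30"),
).fetchall()

# Step 2: embed only the narrowed rows, then rank them against the query.
query_vec = embed("beach")
ranked = sorted(rows, key=lambda r: cosine(embed(r[1]), query_vec), reverse=True)
print(ranked[0])  # -> ('2023-06-15', 'sunny beach trip with friends')
```

The cost of embedding on the fly is bounded by how much the SQL step narrows things down, which is presumably why the pattern can still be practical.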