**Simple local vector store index that supports hybrid search?**

Alright I've got a weird problem trying to wrap up a llama-pack:

I NEED a vector store index object that has both text and vector representations of its data. How can I build a simple vector store from a small corpus of local data (it really doesn't have to be much, just enough to answer 1-2 questions) that supports HYBRID SEARCH? Most of the guides I've seen online build from TextNodes directly or Documents directly, but none of those work because the index in those examples does NOT support hybrid queries. I also can't use free or small instances of services like Pinecone, because these are just test fixtures and I can't expect the llama-index repo to have my credentials (nor would that be best practice).

`ValueError: Invalid query mode: hybrid`
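
For reference, a minimal sketch of the kind of setup that hits this error, assuming the llama-index core API with a mock embedding model so no credentials are needed (the document text and question are placeholders):

```python
from llama_index.core import Document, Settings, VectorStoreIndex
from llama_index.core.embeddings import MockEmbedding
from llama_index.core.vector_stores.types import VectorStoreQueryMode

# Mock embeddings so no external service or API key is required.
Settings.embed_model = MockEmbedding(embed_dim=8)

# Building from documents (or TextNodes) gives an index backed by the
# in-memory SimpleVectorStore, which only implements dense retrieval.
index = VectorStoreIndex.from_documents([Document(text="tiny test corpus")])

# Asking for hybrid retrieval goes through SimpleVectorStore.query, which raises:
#   ValueError: Invalid query mode: hybrid
retriever = index.as_retriever(vector_store_query_mode=VectorStoreQueryMode.HYBRID)
retriever.retrieve("test question")
```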
13 comments
gotta mock mock mock lol
Or implement something that overrides SimpleVectorStore and accepts hybrid search
This is probably easier I think
option 2 might be better, since it's just one class method to override
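
A rough sketch of what that override could look like, assuming it is enough to downgrade hybrid queries to the dense mode SimpleVectorStore already supports (the class name and the usage comments are illustrative, not part of llama-index):

```python
from llama_index.core.vector_stores import SimpleVectorStore
from llama_index.core.vector_stores.types import (
    VectorStoreQuery,
    VectorStoreQueryMode,
    VectorStoreQueryResult,
)


class HybridSimpleVectorStore(SimpleVectorStore):
    """SimpleVectorStore that tolerates hybrid queries (test fixture only)."""

    def query(self, query: VectorStoreQuery, **kwargs) -> VectorStoreQueryResult:
        if query.mode == VectorStoreQueryMode.HYBRID:
            # Fall back to the default dense search instead of raising
            # "ValueError: Invalid query mode: hybrid".
            query.mode = VectorStoreQueryMode.DEFAULT
        return super().query(query, **kwargs)


# Usage sketch: plug it in via a storage context, e.g.
#   storage_context = StorageContext.from_defaults(vector_store=HybridSimpleVectorStore())
#   index = VectorStoreIndex.from_documents(docs, storage_context=storage_context)
```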
I've mocked my LLM and Embedding model, this is the last thing and then I'm so close lol
This has turned out to be more difficult than expected, but it's still the best route. I am just writing a monkeypatch into SimpleVectorStore to accept hybrid queries. To do this, I am not actually implementing the functionality, just returning static results when a hybrid query comes in. I have to return it as a VectorStoreQueryResult, and there seems to be scant documentation around that, but I am running tests and have figured out what I need to do. Onward to tomorrow 😄
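
A hedged sketch of that monkeypatch idea, returning a canned VectorStoreQueryResult for hybrid queries and deferring to the real implementation otherwise (node contents and names are placeholders for test fixtures):

```python
from llama_index.core.schema import TextNode
from llama_index.core.vector_stores import SimpleVectorStore
from llama_index.core.vector_stores.types import (
    VectorStoreQuery,
    VectorStoreQueryMode,
    VectorStoreQueryResult,
)

_original_query = SimpleVectorStore.query


def _query_with_canned_hybrid(self, query: VectorStoreQuery, **kwargs) -> VectorStoreQueryResult:
    """Return a static result for hybrid queries; defer to the real code otherwise."""
    if query.mode == VectorStoreQueryMode.HYBRID:
        node = TextNode(id_="hybrid-node-1", text="Canned hybrid search result.")
        return VectorStoreQueryResult(nodes=[node], similarities=[1.0], ids=[node.id_])
    return _original_query(self, query, **kwargs)


# e.g. in a pytest test: monkeypatch.setattr(SimpleVectorStore, "query", _query_with_canned_hybrid)
SimpleVectorStore.query = _query_with_canned_hybrid
```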
yea that's mostly an internal type lol
Yeah I figured, but if I can return that object I can mock a hybrid result so I'm going with that
It's not too complicated; you guys using Pydantic makes it somewhat easy to guess along the way
Actually, I found an even easier way. More tomorrow because I'm sleepy. As always - thanks @Logan M
All is passing - now to migrate into llama-index.
🎉🎉🎉🎉🎉🎉🎉🎉🎉🎉🎉🎉🎉🎉🎉
[Attachment: image.png]
let's gooooo 🎉