Need help: Goal is to query a Milvus collection with specific doc_ids

My goal is to query a Milvus collection with specific doc_ids: when a user enters a query, instead of searching the whole collection, I want the search restricted to a few specific doc ids.

from ..config import settings
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.core import Settings
from llama_index.vector_stores.milvus import MilvusVectorStore
from llama_index.llms.openai import OpenAI
from llama_index.core.vector_stores.types import VectorStoreQuery

# Global embedding model and LLM
Settings.embed_model = OpenAIEmbedding(model=settings.EMBEDDING_MODEL, api_key=settings.OPENAI_KEY)
Settings.llm = OpenAI(temperature=0.1, api_key=settings.OPENAI_KEY)

# Connect to the existing Milvus collection (overwrite=False keeps its contents)
vector_store = MilvusVectorStore(
    collection_name="testing",
    uri=settings.MILVUS_URI,
    overwrite=False,
    token=settings.MILVUS_TOKEN,
    similarity_metric="L2",
    dim=1536,
)

# Attempt to restrict the query to a single document by its doc_id
query = VectorStoreQuery(doc_ids=["/notebooks/notebooks/data/output/131/emails/2018_08_01_12_27_05_513065_David_Yeagy_140.txt"])
vector_store.query(query)
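
(Note: VectorStoreQuery runs a similarity search, so it also needs a query embedding; the sketch below fills that in. Whether the Milvus store actually honors doc_ids is exactly what this thread asks, so treat it as an untested attempt; the query text and top_k are made up.)

# Untested sketch: supply an embedding for the query text (illustrative text and top_k)
query_embedding = Settings.embed_model.get_query_embedding("what does this email discuss?")
query = VectorStoreQuery(
    query_embedding=query_embedding,
    similarity_top_k=5,
    doc_ids=["/notebooks/notebooks/data/output/131/emails/2018_08_01_12_27_05_513065_David_Yeagy_140.txt"],
)
result = vector_store.query(query)
for node, score in zip(result.nodes, result.similarities):
    print(score, node.get_content()[:100])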
5 comments
I would just put the filename in the metadata and use metadata filtering
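
(Rough sketch of that approach, reusing the vector_store from the question and assuming the nodes were ingested with a "file_name" metadata key, e.g. via SimpleDirectoryReader; the key, value, and query text are placeholders:)

from llama_index.core import VectorStoreIndex
from llama_index.core.vector_stores import ExactMatchFilter, MetadataFilters

# Wrap the existing Milvus collection in an index so we can build a retriever
index = VectorStoreIndex.from_vector_store(vector_store)

# Only return nodes whose "file_name" metadata exactly matches this value
filters = MetadataFilters(
    filters=[ExactMatchFilter(key="file_name", value="2018_08_01_12_27_05_513065_David_Yeagy_140.txt")]
)

retriever = index.as_retriever(filters=filters, similarity_top_k=5)
nodes = retriever.retrieve("what does this email discuss?")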
okay okay, any reference docs?
We need better docs for this, but here's an example with Pinecone:
https://docs.llamaindex.ai/en/stable/examples/vector_stores/pinecone_metadata_filter/?h=metadata+fil

Looks like Milvus only supports exact-match filters though (or at least, that's what is implemented).
Is there a metadata filter for Milvus?
Can we do this with a chat engine?
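
(Untested sketch of one way: build the chat engine from a filtered retriever, reusing the retriever from the metadata-filter example above, so every chat turn only retrieves from the selected document:)

from llama_index.core.chat_engine import ContextChatEngine

# The chat engine picks up Settings.llm; retrieval goes through the filtered retriever
chat_engine = ContextChatEngine.from_defaults(retriever=retriever)
response = chat_engine.chat("Summarize this email")
print(response)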