Milvus

Hi, sorry for the noob question, but I have a pipeline that already populates a Milvus database with embeddings and content.

I've searched for a few hours but can't figure out how to use LlamaIndex with that existing DB and its existing embeddings as part of a RAG workflow.

I'm using LlamaIndex 0.11.15. I'm creating a MilvusVectorStore, and whenever I try to create an index from the vector store and pass it my Milvus object, it complains about an OpenAI API key not being present. I don't quite understand that, because all my embeddings are already pre-generated. I may be lost in the sauce here. Any advice? Has anyone done this before? Much appreciated, TIA 🙂
Hi there,
You can connect to an existing vector store this way:

```python
# Create an index over the documents already stored in Milvus
from llama_index.core import VectorStoreIndex
from llama_index.vector_stores.milvus import MilvusVectorStore

vector_store = MilvusVectorStore(
    uri="./milvus_demo.db",
    dim=1536,        # must match the dimension of your stored embeddings
    overwrite=False,  # don't wipe the existing collection
)

index = VectorStoreIndex.from_vector_store(
    vector_store=vector_store
)
```
Just keep in mind that you need to use the same embedding model that you used to generate the embeddings in the first place.
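For example, you can set the embedding model explicitly so LlamaIndex never falls back to its OpenAI defaults. This is only a sketch: the HuggingFace model name and the `dim` value are placeholders, and both must match whatever model actually produced the embeddings stored in Milvus.

```python
# Sketch: explicitly configure the embedding model so the OpenAI
# default (and its API-key check) is never triggered.
from llama_index.core import Settings, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.vector_stores.milvus import MilvusVectorStore

# Placeholder model -- substitute the model used to build your collection.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-large-en-v1.5")

vector_store = MilvusVectorStore(
    uri="./milvus_demo.db",
    dim=1024,         # bge-large embeddings are 1024-dimensional
    overwrite=False,  # reuse the existing collection
)

# embed_model can also be passed directly instead of via Settings
index = VectorStoreIndex.from_vector_store(
    vector_store=vector_store,
    embed_model=Settings.embed_model,
)
```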
So I tried this, except it complains about not having an OpenAI key. I attached an image of the error; I'm doing basically exactly that, just without overwrite.
Do you have your llm and embed_model in place?
I tried setting the embedding model using Settings.embed_model = llm-embedding-object (a Databricks LLM serving endpoint).
OpenAI is the default for both llm and embed_model if you don't set them, which is why it asks for an OpenAI API key.
It gives me an assertion error when I try to set the embed_model; it says it has been disabled.
Oh wait, maybe I see the error. Is the embedding model a different kind of object from the LLM? I just hooked the LLM class up to two different serving endpoints, one of which happened to be an embedding model.
Yes, the embedding model object is a different class from the LLM object.
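A sketch of what that looks like, assuming the separate `llama-index-llms-databricks` and `llama-index-embeddings-databricks` integration packages are installed; the class names come from those packages, and the model/endpoint names below are placeholders, not values from this thread:

```python
# Sketch: the LLM and the embedding model are two distinct classes,
# each pointed at its own Databricks serving endpoint.
from llama_index.core import Settings
from llama_index.llms.databricks import Databricks
from llama_index.embeddings.databricks import DatabricksEmbedding

# Placeholder endpoint/model names -- substitute your own.
Settings.llm = Databricks(model="databricks-dbrx-instruct")
Settings.embed_model = DatabricksEmbedding(model="databricks-bge-large-en")
```

Hooking the LLM class up to an embedding endpoint won't work; each class speaks a different request/response shape.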
Gotcha. Thank you so much!