
@Logan M can you please help with this? I am trying to embed the images and text, but this snippet is not working even though I copied it from a notebook provided by LlamaIndex.
that code looks fine to me, but sharing the error might be helpful
This is the code that I am using at the moment


import os

from llama_index.core.indices import MultiModalVectorStoreIndex
from llama_index.vector_stores.qdrant import QdrantVectorStore
from llama_index.core import SimpleDirectoryReader, StorageContext
import qdrant_client

os.environ["OPENAI_API_KEY"] = "...."

# Create a local Qdrant vector store
client = qdrant_client.QdrantClient(path="qdrant_mm_db")

text_store = QdrantVectorStore(
    client=client, collection_name="text_collection"
)
image_store = QdrantVectorStore(
    client=client, collection_name="image_collection"
)
storage_context = StorageContext.from_defaults(
    vector_store=text_store, image_store=image_store
)

# Create the MultiModal index
documents = SimpleDirectoryReader("./mixed_wiki/").load_data()
index = MultiModalVectorStoreIndex.from_documents(
    documents,
    storage_context=storage_context,
)

# Save it
index.storage_context.persist(persist_dir="./storage")

# Load it
from llama_index.core import load_index_from_storage

storage_context = StorageContext.from_defaults(
    vector_store=text_store, persist_dir="./storage"
)
index = load_index_from_storage(storage_context, image_store=image_store)
How did you create client? It seems like it's pointing to an OpenAI client instead of a Qdrant client
Oh wait, it's in the snippet above, but commented out?
Maybe double-check that
I created it with client = qdrant_client.QdrantClient(path="qdrant_mm_db")
Once it was executed, it created a folder along with a JSON file inside.
So it worked. But once it was created, I commented out this line of code.
I am just trying to establish a simple setup that puts a few charts and graphs alongside PDFs so that I can also query the charts and graphs.
I tried all the notebooks but none of them is working, maybe due to the updates made to the llama-index library.
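(For reference, once the index above builds correctly, charts and graphs are typically queried through the multimodal retriever. The sketch below assumes the index variable from the snippet earlier in the thread; the top-k values and query string are only illustrative.)

# Minimal retrieval sketch (assumes `index` from the snippet above built successfully)
retriever = index.as_retriever(similarity_top_k=3, image_similarity_top_k=3)
results = retriever.retrieve("What trend does the revenue chart show?")
for node_with_score in results:
    # Each result carries the source file path (text or image) and a similarity score
    print(node_with_score.node.metadata.get("file_path"), node_with_score.score)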
You need the client; it's probably not a good idea to comment it out
You should always have QdrantVectorStore("collection_name", client=QdrantClient(...), ...)
Otherwise, without the client, it can't connect
And in this case, it seems like you have another variable called client that you are passing in, which points to an OpenAI object
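(A minimal sketch of what Logan is describing, keeping the Qdrant client under a distinct variable name so it cannot be shadowed by an OpenAI client; the name qdrant is just an illustration.)

import qdrant_client
from llama_index.vector_stores.qdrant import QdrantVectorStore
from llama_index.core import StorageContext

# Keep the Qdrant client alive and under its own name -- do not comment it out
# or reuse a variable that elsewhere holds an OpenAI client
qdrant = qdrant_client.QdrantClient(path="qdrant_mm_db")

text_store = QdrantVectorStore(client=qdrant, collection_name="text_collection")
image_store = QdrantVectorStore(client=qdrant, collection_name="image_collection")
storage_context = StorageContext.from_defaults(
    vector_store=text_store, image_store=image_store
)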
let me check
@Logan M it worked very well. I think the problem was another OpenAI client that was conflicting here. Now the last thing I need your help with is this: index.as_retriever is working well, but with as_query_engine I get an error, and it is not telling me what the error is.
Attachment: image.png
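(Since the attached error isn't readable here, this is only a guess: as_query_engine on a multimodal index is normally given a multimodal LLM. A minimal sketch, assuming the OpenAI multimodal integration is installed; the model name and token limit are illustrative, and depending on the llama-index version the keyword may be llm= or multi_modal_llm=.)

from llama_index.multi_modal_llms.openai import OpenAIMultiModal

# Illustrative model name and token limit; adjust to whatever your account supports
openai_mm_llm = OpenAIMultiModal(model="gpt-4o", max_new_tokens=300)

# Depending on the llama-index version, the keyword may be llm= or multi_modal_llm=
query_engine = index.as_query_engine(llm=openai_mm_llm)
response = query_engine.query("Summarize the bar chart in the report")
print(response)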
Hi @Ash_, can we connect in a personal chat? I sent you a request as well, as I'm also working on this.