Find answers from the community

Ray
Hi all, I have an index persisted with the default storage. I want to delete some nodes so that they can no longer be retrieved in subsequent queries. I tried the following code, but it doesn't work:

Plain Text
from llama_index import (
    StorageContext, 
    load_index_from_storage, 
)

path = r"C:\Users\..."
storage_context = StorageContext.from_defaults(persist_dir=path)
index = load_index_from_storage(storage_context, use_async=True)

index.delete_ref_doc(hash_id, delete_from_docstore=True)
myquery = index.as_query_engine()
response = myquery.query(question)

What step am I missing?
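
A step that is often missing in this pattern is writing the updated index back to disk after the deletion; otherwise the files under persist_dir still contain the removed nodes the next time the index is loaded. A minimal sketch, assuming the same path and hash_id as above (note that delete_ref_doc expects the ref_doc_id of the inserted source document, not an arbitrary node hash):

Plain Text
index.delete_ref_doc(hash_id, delete_from_docstore=True)

# persist the updated docstore / index store / vector store back to disk
index.storage_context.persist(persist_dir=path)

# query against the updated index
myquery = index.as_query_engine()
response = myquery.query(question)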
2 comments
Hi, I use the following code to get the most relevant nodes. I can also access their metadata, but how do I get the embedding vectors of the retrieved nodes?
Plain Text
import chromadb
from llama_index.core.storage.storage_context import StorageContext
from llama_index.vector_stores.chroma import ChromaVectorStore
from llama_index.embeddings.huggingface import HuggingFaceEmbedding 
from llama_index.core import (
    Document,
    VectorStoreIndex,
    Settings
)

documents = [Document(
    text=text,
    metadata={
        'filename': file_name,
        'keyword': keyword,
    }
)]

...

vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
Settings.embed_model = HuggingFaceEmbedding(model_name='BAAI/bge-large-zh-v1.5')

index = VectorStoreIndex.from_documents(
    documents,
    storage_context=storage_context,
)

...

retriever = index.as_retriever(similarity_top_k=3)
relevant_nodes = retriever.retrieve("my query")


I tried to get the vector like this:
Plain Text
relevant_nodes[0].get_embedding()

# Output
raise ValueError("embedding not set.")
ValueError: embedding not set.
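
The error suggests the retriever did not attach embeddings to the returned nodes in this setup. Two possible workarounds, sketched under the assumption that the collection and Settings objects from the code above are still in scope: recompute the vector with the same embed model, or read the stored vector back from the Chroma collection by node id.

Plain Text
node = relevant_nodes[0].node

# Option 1: recompute the embedding with the same model used at index time
vector = Settings.embed_model.get_text_embedding(node.get_content())

# Option 2: fetch the stored vector from Chroma directly
# (assumes the Chroma ids match the LlamaIndex node_ids, which is how
#  ChromaVectorStore writes them)
result = collection.get(ids=[node.node_id], include=["embeddings"])
vector = result["embeddings"][0]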
2 comments