Pinecone

Hey there! I've been using a Pinecone index for retrieval so far and now want to use LlamaIndex on top of Pinecone. However, when I try to query my query engine I get "ValueError: Node content not found in metadata dict." My Pinecone index is not empty. I'm new to LlamaIndex, so I'm still figuring it out. Can someone please point me in the right direction?

Here's my code so far:

Plain Text
from llama_index import ServiceContext, VectorStoreIndex
from llama_index.llms import OpenAI
from llama_index.query_engine import RetrieverQueryEngine
from llama_index.retrievers import VectorIndexRetriever
from llama_index.vector_stores import PineconeVectorStore

# Connect to the existing Pinecone index
vector_store = PineconeVectorStore(
    api_key=PINECONE_API_KEY,
    environment=PINECONE_ENVIRONMENT,
    index_name=PINECONE_INDEX,
)

llm = OpenAI(temperature=0, model="gpt-4-1106-preview")
service_context = ServiceContext.from_defaults(llm=llm)

# Wrap the existing vector store in a LlamaIndex index
index = VectorStoreIndex.from_vector_store(vector_store=vector_store, service_context=service_context)

vector_index_retriever = VectorIndexRetriever(index)

query_engine = RetrieverQueryEngine(vector_index_retriever)

response = query_engine.query("Title: Phase III Clinical Trial of Novel Anticoagulant Compound Z in Adult Patients with Atrial Fibrillation")
response
4 comments
Usually it works best if you built the index with pinecone

But otherwise, you can set a few values to make life easy

vector_store = PineconeVectorStore(..., text_key="<field with text>")
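For example, if your existing vectors keep the raw document text in a metadata field, something like this should work (the field name "text" below is just a placeholder -- use whatever field your index actually stores the text under):

Plain Text
vector_store = PineconeVectorStore(
    api_key=PINECONE_API_KEY,
    environment=PINECONE_ENVIRONMENT,
    index_name=PINECONE_INDEX,
    text_key="text",  # placeholder: the metadata field that holds your document text
)

index = VectorStoreIndex.from_vector_store(vector_store=vector_store, service_context=service_context)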
Thanks for responding! What do you mean by "built the index with pinecone"? This is an existing pinecone index I'm trying to use
Sorry typo -- built with llamaindex (using pinecone)
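That is, ingesting your documents through LlamaIndex so it writes its node content into the Pinecone metadata itself -- roughly like this (sketch only; "./data" is a placeholder for wherever your source documents live):

Plain Text
from llama_index import SimpleDirectoryReader, StorageContext, VectorStoreIndex

documents = SimpleDirectoryReader("./data").load_data()  # placeholder path

storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(
    documents,
    storage_context=storage_context,
    service_context=service_context,
)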