
At a glance

The community member has an existing Neo4j graph with nodes containing text and embedding properties. They want to integrate this graph with LlamaIndex, but are encountering an error related to the metadata structure expected by LlamaIndex. The community members suggest creating the embeddings again using LlamaIndex's recommended approach, as the structure for storing the content might be different from the one used in the existing application. They discuss how to create the nodes and index in Neo4j, and a community member suggests following the LlamaIndex documentation on using the Neo4jVectorStore to ensure compatibility. The community members also discuss the possibility of creating relationships between the nodes generated by LlamaIndex and the existing nodes in the graph.

Hi, I've got a problem with LlamaIndex. What is the best channel to ask for help?
This is one of the places. Shoot your query.
I've got an existing Neo4j graph with some nodes and relationships between them.

The "Chunk" nodes have the following properties:
  • text: which contains a text chunk
  • embedding: which contains a 1536-dimensional vector representation of the text
Now I want to attach LlamaIndex to this existing graph and the existing vector index built in Neo4j.

So I can do the following:

import os

from llama_index.core import VectorStoreIndex
from llama_index.vector_stores.neo4jvector import Neo4jVectorStore

neo4j_vector = Neo4jVectorStore(
    embedding_dimension=1536,
    username=os.getenv("NEO4J_USERNAME"),
    password=os.getenv("NEO4J_PASSWORD"),
    url=os.getenv("NEO4J_URL"),
    index_name=VECTOR_INDEX_NAME,  # "chunks_embedding"
    text_node_property=VECTOR_SOURCE_PROPERTY,  # "text"
    embedding_node_property="embedding",
    node_label=VECTOR_NODE_LABEL,  # "Chunk"
)

retriever = VectorStoreIndex.from_vector_store(neo4j_vector).as_retriever()

retriever.retrieve("my query")

But I've got the following error:
ValueError("Node content not found in metadata dict.")

I think that LlamaIndex is expecting a metadata dictionary in the nodes (because I didn't create the embeddings and the nodes with LlamaIndex).
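
For context, LlamaIndex's vector store helpers serialize each node into the metadata under a "_node_content" key, and rebuilding a node fails with exactly this message when that key is missing. A minimal sketch to inspect that structure, assuming a recent llama-index-core where the helpers live in llama_index.core.vector_stores.utils:

from llama_index.core.schema import TextNode
from llama_index.core.vector_stores.utils import (
    metadata_dict_to_node,
    node_to_metadata_dict,
)

# Build a node the way LlamaIndex would and look at the metadata it persists.
node = TextNode(text="some chunk", metadata={"source": "example"})
meta = node_to_metadata_dict(node)
print(meta.keys())  # includes "_node_content", the serialized node JSON

# The reverse step is what runs on rows coming back from the vector store;
# without "_node_content" it raises "Node content not found in metadata dict."
restored = metadata_dict_to_node(meta)
print(restored.get_content())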

How can I resolve this?
I think the best way to resolve this would be to create the embeddings again with LlamaIndex, since the structure it uses for storing the content is likely different from the one used by the other application you created them with.
That is my idea, but how can I do that?
I would like to use LlamaIndex to iterate over my Neo4j nodes and then add the structure and embeddings that LlamaIndex wants.
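
One way that could look (a sketch, not something from the thread): read the existing Chunk nodes with the Neo4j Python driver, wrap each one in a LlamaIndex TextNode that reuses the stored text and embedding, and write them back through a Neo4jVectorStore so the store adds the metadata structure it expects. The new index name, node label and the source_element_id metadata key below are made-up placeholders:

import os

from neo4j import GraphDatabase
from llama_index.core.schema import TextNode
from llama_index.vector_stores.neo4jvector import Neo4jVectorStore

driver = GraphDatabase.driver(
    os.getenv("NEO4J_URL"),
    auth=(os.getenv("NEO4J_USERNAME"), os.getenv("NEO4J_PASSWORD")),
)

# Pull the text and the already-computed embedding from the existing Chunk nodes.
with driver.session() as session:
    rows = session.run(
        "MATCH (c:Chunk) WHERE c.text IS NOT NULL "
        "RETURN elementId(c) AS id, c.text AS text, c.embedding AS embedding"
    ).data()

# Re-wrap them as LlamaIndex nodes, reusing the existing embeddings.
nodes = [
    TextNode(
        text=row["text"],
        embedding=row["embedding"],
        metadata={"source_element_id": row["id"]},  # placeholder key
    )
    for row in rows
]

# Writing through the vector store creates nodes in the shape LlamaIndex expects;
# a separate label/index is used here so the original Chunk nodes stay untouched.
vector_store = Neo4jVectorStore(
    embedding_dimension=1536,
    username=os.getenv("NEO4J_USERNAME"),
    password=os.getenv("NEO4J_PASSWORD"),
    url=os.getenv("NEO4J_URL"),
    index_name="llama_chunks_embedding",  # placeholder
    node_label="LlamaChunk",              # placeholder
)
vector_store.add(nodes)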
How did you create these nodes?
Using the Neo4j Python driver.
Then I created the index and the embeddings with these queries:
CREATE VECTOR INDEX chunks_embedding IF NOT EXISTS
FOR (c:Chunk) ON (c.embedding)
OPTIONS { indexConfig: {
    `vector.dimensions`: 1536,
    `vector.similarity_function`: 'cosine'
}}

and then, as a separate statement:

MATCH (c:Chunk) WHERE c.text IS NOT NULL
WITH c, genai.vector.encode(
    c.text,
    "OpenAI",
    {
        token: MY_TOKEN,
        endpoint: "https://api.openai.com/v1/embeddings"
    }) AS vector
CALL db.create.setNodeVectorProperty(c, "embedding", vector)
No idea about this tbh. I would suggest you create the nodes following this doc: https://docs.llamaindex.ai/en/stable/examples/vector_stores/Neo4jVectorDemo/?h=neo4j
That way it will be compatible with everything you do with LlamaIndex.
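
For reference, the flow in that doc looks roughly like the sketch below; the ./data path is a placeholder and the embedding model falls back to whatever is configured globally in LlamaIndex:

import os

from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.neo4jvector import Neo4jVectorStore

# Load source documents (directory path is a placeholder).
documents = SimpleDirectoryReader("./data").load_data()

# Point the vector store at Neo4j; LlamaIndex creates the index, the nodes and
# the metadata structure itself, so retrieval later works out of the box.
neo4j_vector = Neo4jVectorStore(
    embedding_dimension=1536,
    username=os.getenv("NEO4J_USERNAME"),
    password=os.getenv("NEO4J_PASSWORD"),
    url=os.getenv("NEO4J_URL"),
)

storage_context = StorageContext.from_defaults(vector_store=neo4j_vector)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

retriever = index.as_retriever()
print(retriever.retrieve("my query"))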
OK, got it. But then I need to create some relationships between the nodes generated by LlamaIndex and other nodes inside the graph. That is possible, right?
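
Linking them is possible with plain Cypher, since the chunks LlamaIndex writes are ordinary Neo4j nodes in the same database. A rough sketch with the Neo4j Python driver; the labels, the matching condition and the relationship type are all placeholders to adapt to how your metadata actually lands on the nodes:

import os

from neo4j import GraphDatabase

driver = GraphDatabase.driver(
    os.getenv("NEO4J_URL"),
    auth=(os.getenv("NEO4J_USERNAME"), os.getenv("NEO4J_PASSWORD")),
)

# Link each LlamaIndex-created chunk to an existing node. "LlamaChunk",
# "Document", the matching property and the relationship type are placeholders.
link_query = """
MATCH (lc:LlamaChunk), (d:Document)
WHERE lc.source_element_id = d.element_id
MERGE (lc)-[:DERIVED_FROM]->(d)
"""

with driver.session() as session:
    session.run(link_query)

driver.close()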