Updated 2 months ago

I am trying to load and query a persisting neo4j Graph DB.

My current code to open the Graph DB looks like this:

service_context = ServiceContext.from_defaults(llm=llm, chunk_size=512)
graph_store = Neo4jGraphStore(
    username="neo4j",
    password="password",
    url="bolt://localhost:7687",
    database="neo4j",
)
graph_storage_context = StorageContext.from_defaults(graph_store=graph_store)
graph_index = KnowledgeGraphIndex(
    storage_context=graph_storage_context,
    kg_triplet_extract_fn=extract_triplets,
    service_context=service_context,
    verbose=True,
)
However, this only returns an error, which I do not know how to approach:

File "/home/fabian/Desktop/RAG/scripts/medium.py", line 118, in <module>
graph_index = KnowledgeGraphIndex(
File "/home/fabian/Desktop/RAG/.venv/lib/python3.10/site-packages/llama_index/indices/knowledge_graph/base.py", line 81, in __init__
super().__init__(
File "/home/fabian/Desktop/RAG/.venv/lib/python3.10/site-packages/llama_index/indices/base.py", line 47, in __init__
raise ValueError("One of nodes or index_struct must be provided.")
ValueError: One of nodes or index_struct must be provided.

Can you please help me and provide me with a working snippet of code?
9 comments
Please stand by @fabian as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Based on the error message you provided, it seems like you need to provide either nodes or index_struct when initializing KnowledgeGraphIndex. However, in the code snippet you provided, neither of these parameters is given.
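The guard that raises this error can be sketched as follows (a simplified, illustrative reproduction of the idea, not the actual llama_index source). Note that an empty list is not None, so it counts as "provided":

```python
# Simplified sketch of the check in BaseIndex.__init__ that produces
# the error above (illustrative only, not the library's real code).
def init_index(nodes=None, index_struct=None):
    if nodes is None and index_struct is None:
        raise ValueError("One of nodes or index_struct must be provided.")
    # An empty list passes the check: nothing is extracted, but the
    # index can still be constructed against an existing store.
    return nodes if nodes is not None else index_struct

init_index(nodes=[])  # OK: an empty list satisfies the check
```

This is why calling KnowledgeGraphIndex(...) with only storage and service contexts fails: neither nodes nor index_struct was given.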

From the information extracts, I can provide a code snippet that shows how to initialize KnowledgeGraphIndex using documents loaded from a directory. This might not be exactly what you need, but it could give you an idea of how to proceed.

Here is the code snippet:

Plain Text
from llama_index import (
    KnowledgeGraphIndex,
    ServiceContext,
    SimpleDirectoryReader,
)
from llama_index.storage.storage_context import StorageContext
from llama_index.graph_stores import Neo4jGraphStore
from llama_index.llms import OpenAI

# Load documents from a directory
documents = SimpleDirectoryReader(
    "../../../../examples/paul_graham_essay/data"
).load_data()

# Define LLM
llm = OpenAI(temperature=0, model="gpt-3.5-turbo")
service_context = ServiceContext.from_defaults(llm=llm, chunk_size=512)

# Define graph store
graph_store = Neo4jGraphStore(
    username="neo4j",
    password="password",
    url="bolt://localhost:7687",
    database="neo4j",
)

storage_context = StorageContext.from_defaults(graph_store=graph_store)

# Instantiate KnowledgeGraphIndex
index = KnowledgeGraphIndex.from_documents(
    documents,
    storage_context=storage_context,
    max_triplets_per_chunk=2,
    service_context=service_context,
)

This code snippet is based on information extracted from the LlamaIndex documentation examples.
Please note that you need to replace "../../../../examples/paul_graham_essay/data" with the path to your documents. Also, you might need to adjust the max_triplets_per_chunk parameter according to your needs.

If this doesn't solve your problem, I would recommend checking the LlamaIndex documentation or reaching out to the LlamaIndex community for more specific help.
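As a hedged sketch for connecting to an already-populated graph rather than building a new one (assuming the legacy llama_index 0.9.x API used in this thread): one commonly suggested workaround is to pass an empty node list, which satisfies the "nodes or index_struct" check without extracting any new triplets from documents. This is unverified against your setup; the connection parameters below mirror those from the question:

```python
from llama_index import KnowledgeGraphIndex, ServiceContext
from llama_index.graph_stores import Neo4jGraphStore
from llama_index.llms import OpenAI
from llama_index.storage.storage_context import StorageContext

llm = OpenAI(temperature=0, model="gpt-3.5-turbo")
service_context = ServiceContext.from_defaults(llm=llm, chunk_size=512)

# Connect to the already-populated Neo4j database
graph_store = Neo4jGraphStore(
    username="neo4j",
    password="password",
    url="bolt://localhost:7687",
    database="neo4j",
)
storage_context = StorageContext.from_defaults(graph_store=graph_store)

# nodes=[] satisfies the "nodes or index_struct" check without
# running triplet extraction over any documents
index = KnowledgeGraphIndex(
    nodes=[],
    storage_context=storage_context,
    service_context=service_context,
)

# Query against the existing graph; include_text=False because the
# original text chunks may not be stored alongside the graph
query_engine = index.as_query_engine(include_text=False)
```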
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai This solution only works if I need to generate a NEW graph database. However, I want to load an existing one. Please provide a solution that does not use the .from_documents function and instead loads the graph DB from storage.
Let me check the knowledge base for you. One moment! :books:
Sorry @fabian, I am experiencing especially high traffic right now, please try again. :technologist:
@kapa.ai This solution only works if I need to generate a NEW graph database. However, I want to load an existing one. Please provide a solution that does not use the .from_documents function and instead loads the graph DB from storage.
Let me check the knowledge base for you. One moment! :books:
Sorry @fabian, I am experiencing especially high traffic right now, please try again. :technologist: