Find answers from the community

Updated 4 months ago

- How can I convert an existing Neo4j knowledge graph into a LlamaIndex "property graph"?

At a glance

The post asks how to convert an existing Neo4j knowledge graph into a LlamaIndex "property graph". The comments provide a detailed answer, explaining the steps to follow:

1. Connect to the existing Neo4j graph by creating an instance of Neo4jPropertyGraphStore with the necessary credentials and URL.

2. Create a PropertyGraphIndex from the existing graph using the from_existing method, specifying the property_graph_store, language model (llm), and embedding model (embed_model).

The community members provide example code to demonstrate the process, using OpenAI's GPT-3.5-turbo as the language model and a small text embedding model. They also note that if the graph was created outside of LlamaIndex, the most useful retrievers will be "text to cypher" or "cypher templates".

The community members suggest referring to the LlamaIndex documentation for more details.

- How can I convert an existing Neo4j knowledge graph into a LlamaIndex "property graph"?
3 comments
Please stand by @rawwerks as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
You can convert an existing Neo4j knowledge graph into a LlamaIndex "Property Graph" by following these steps:

  1. First, you need to connect to your existing Neo4j graph. You can do this by creating an instance of Neo4jPropertyGraphStore with your Neo4j credentials and URL. Here is an example:
```python
from llama_index.graph_stores.neo4j import Neo4jPropertyGraphStore

graph_store = Neo4jPropertyGraphStore(
    username="neo4j",
    password="your_password",
    url="bolt://localhost:7687",
)
```

Replace "your_password" with your actual Neo4j password.

  2. Then, you can create a PropertyGraphIndex from the existing graph using the from_existing method. You need to specify the property_graph_store (which is the graph_store we just created), the language model (llm), and the embedding model (embed_model). Here is an example:
```python
from llama_index.core import PropertyGraphIndex
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI

index = PropertyGraphIndex.from_existing(
    property_graph_store=graph_store,
    llm=OpenAI(model="gpt-3.5-turbo", temperature=0.3),
    embed_model=OpenAIEmbedding(model_name="text-embedding-3-small"),
)
```

In this example, we are using OpenAI's GPT-3.5-turbo as the language model and the text-embedding-3-small model for embeddings. You can replace these with the models of your choice.

  3. Now, you can use the index to insert more documents, retrieve nodes, and perform other operations.
Please note that if your graph was created outside of LlamaIndex, the most useful retrievers will be "text to cypher" or "cypher templates", since the other retrievers rely on properties that LlamaIndex inserts when it builds the graph itself.
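The "text to cypher" option mentioned above corresponds to LlamaIndex's TextToCypherRetriever, which asks the LLM to translate a natural-language question into a Cypher query, so it does not depend on any LlamaIndex-specific node properties. A minimal sketch of wiring it in, assuming the graph_store and index from the earlier steps (the example question is made up for illustration):

```python
from llama_index.core.indices.property_graph import TextToCypherRetriever
from llama_index.llms.openai import OpenAI

# TextToCypherRetriever generates Cypher from the question at query time,
# so it works even on graphs that were created outside of LlamaIndex.
cypher_retriever = TextToCypherRetriever(
    graph_store,
    llm=OpenAI(model="gpt-3.5-turbo"),
)

# Plug it into the index as a sub-retriever and query as usual.
retriever = index.as_retriever(sub_retrievers=[cypher_retriever])
nodes = retriever.retrieve("Which people work at Acme Corp?")  # hypothetical question
for node in nodes:
    print(node.text)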
For more details, refer to the LlamaIndex documentation.
----------------------