Once I have already created a Knowledge Graph Index using the "from_documents" method, how can I just load the index from the existing Neo4j knowledge graph that I've created?
I know the Property Graph Index has a "from_existing" method but I don't see that method in the Knowledge Graph Index
Knowledge graph index is kind of deprecated.

But in any case, unless you built the graph with llama-index, I think the only choice is text to cypher
Is there a more up-to-date graph RAG index I should use?
Property graph index is the go to
It's much more customizable
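For reference, a minimal sketch of the from_existing route with the Neo4j property graph store, assuming the graph was built with llama-index (credentials and URL are placeholders; the LLM and embed model fall back to Settings unless you pass them in):

Python
from llama_index.core import PropertyGraphIndex
from llama_index.graph_stores.neo4j import Neo4jPropertyGraphStore

# Point at the Neo4j instance that already holds the graph
graph_store = Neo4jPropertyGraphStore(
    username="neo4j",
    password="password",
    url="bolt://localhost:7687",
)

# Rebuild the index object on top of the existing graph; no re-extraction needed
index = PropertyGraphIndex.from_existing(property_graph_store=graph_store)

retriever = index.as_retriever(include_text=True)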
Does property graph also have the "include embeddings" option like the Knowledge graph index does?
It assumes embeddings. Probably a good idea to read up on the (very detailed) docs
https://docs.llamaindex.ai/en/stable/examples/property_graph/property_graph_neo4j/
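Roughly, the guide's pattern with the graph store passed in looks like the sketch below (paths, credentials, and extractor choices are placeholders; node embeddings are on by default via embed_kg_nodes=True):

Python
from llama_index.core import PropertyGraphIndex, SimpleDirectoryReader
from llama_index.graph_stores.neo4j import Neo4jPropertyGraphStore

documents = SimpleDirectoryReader("./data").load_data()

graph_store = Neo4jPropertyGraphStore(
    username="neo4j",
    password="password",
    url="bolt://localhost:7687",
)

# LLM and embed model come from Settings unless passed explicitly;
# kg_extractors are left at their defaults here
index = PropertyGraphIndex.from_documents(
    documents,
    property_graph_store=graph_store,
    show_progress=True,
)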
I followed this guide to use the Property Graph Index with Neo4j, but used ollama to run my own model instead of OpenAI. After running for a couple of hours to create the graph, it errored out with this message:

Plain Text
neo4j.exceptions.ClientError: {code: Neo.ClientError.Procedure.ProcedureCallFailed} {message: Failed to invoke procedure `db.create.setNodeVectorProperty`: Caused by: java.lang.IllegalArgumentException: Vector must only contain finite values. Provided: SequenceValueVectorCandidate[sequence=List{}]}
Seems like an issue with embeddings?
I used Neo4j to store the embedding vectors but maybe that causes problems?
I followed the guide pretty much exactly with my own docs so I'm not sure what's wrong
I guess I just meant, it sounds like your embedding model returned inf values
Do you have any suggestions to ensure that the embeddings don't return inf values?
Not sure 🤔 Normally this shouldn't happen actually. But I guess it depends what embedding model you were using? Which embed model was this?
I've tested with mxbai-embed-large:latest and all-minilm:33m
Both had the same issue
Is this code only compatible with OpenAI embedding models?
But what embedding model class? HuggingFaceEmbedding?
It should work with any embedding model. But it seems like the one you are using has a bug
Python
from llama_index.embeddings.ollama import OllamaEmbedding

ollama_embedding = OllamaEmbedding(
    model_name="all-minilm:33m",
    base_url="http://localhost:11434",
)
So maybe the OllamaEmbedding class is the problem? I've tested multiple sentence transformer models and they all appear to have the same issue
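One way to narrow it down is to call the embed model directly and check the returned vector for empty or non-finite values (quick diagnostic sketch reusing the snippet above; the test sentence is arbitrary):

Python
import math

from llama_index.embeddings.ollama import OllamaEmbedding

embed_model = OllamaEmbedding(
    model_name="all-minilm:33m",
    base_url="http://localhost:11434",
)

vec = embed_model.get_text_embedding("quick sanity check sentence")
print(len(vec))  # should match the model's dimension, not 0
assert vec, "embedding came back empty"
assert all(math.isfinite(v) for v in vec), "embedding contains inf/nan"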
Might be an issue with ollama yea 🤔 Have you updated the ollama server recently?
Yea I'm using ollama v1.4.8