Following the example for Graph Store

Following the example for Graph Store with a Custom LLM, I was able to build the indices, but when running a query, it seems to hang:

response = query_engine.query("Tell me more about Interleaf")

I have only implemented the metadata property and the complete function in my CustomLLM. What could be the reason behind the hang?
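For reference, a minimal sketch of what such a CustomLLM looks like, assuming the 0.8-era llama_index import paths; MyLLM and my_model_generate() are placeholder names standing in for the actual model call:

import my_model  # hypothetical module wrapping the actual Llama 2 model

from llama_index.llms import (
    CustomLLM,
    CompletionResponse,
    CompletionResponseGen,
    LLMMetadata,
)
from llama_index.llms.base import llm_completion_callback

CONTEXT_WINDOW = 4096
NUM_OUTPUT = 256


class MyLLM(CustomLLM):
    @property
    def metadata(self) -> LLMMetadata:
        # Advertise the model's limits so prompt packing stays within them.
        return LLMMetadata(
            context_window=CONTEXT_WINDOW,
            num_output=NUM_OUTPUT,
            model_name="custom-llama2",
        )

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs) -> CompletionResponse:
        # my_model.generate() is a placeholder for however the model is called.
        return CompletionResponse(text=my_model.generate(prompt))

    @llm_completion_callback()
    def stream_complete(self, prompt: str, **kwargs) -> CompletionResponseGen:
        # Non-streaming fallback: yield the full completion once.
        yield CompletionResponse(text=my_model.generate(prompt), delta="")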
Maybe the input was too big for your LLM. You may have to decrease the context window.
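For example (a sketch, assuming the 0.8-era ServiceContext API; the numbers are illustrative and should match what the CustomLLM's metadata reports):

from llama_index import ServiceContext

service_context = ServiceContext.from_defaults(
    llm=MyLLM(),          # the custom LLM sketched above
    context_window=2048,  # cap the prompt size the index is allowed to build
    num_output=256,       # room reserved for the completion
    chunk_size=512,       # smaller chunks keep each prompt well inside the window
)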
Doesn't look to be it, I double-checked and the output and context windows are not exceeded.

index = KnowledgeGraphIndex.from_documents(
    documents,
    storage_context=storage_context,
    max_triplets_per_chunk=2,
    service_context=service_context,
)

This was able to run successfully with the LLM.
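For anyone following along, the storage_context in that snippet comes from a Neo4j graph store, roughly as in the graph store example; the connection details below are placeholders:

from llama_index import StorageContext
from llama_index.graph_stores import Neo4jGraphStore

# Placeholder credentials; replace with your own Neo4j instance.
graph_store = Neo4jGraphStore(
    username="neo4j",
    password="password",
    url="bolt://localhost:7687",
    database="neo4j",
)
storage_context = StorageContext.from_defaults(graph_store=graph_store)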
Never mind, I waited long enough for it to time out:
LookupError:
**********************************************************************
  Resource stopwords not found.
  Please use the NLTK Downloader to obtain the resource:

  >>> import nltk
  >>> nltk.download('stopwords')

  For more information see: https://www.nltk.org/data.html

  Attempted to load corpora/stopwords

  Searched in:
    - '/home/ml/nltk_data'
    - '/home/ml/neo4nan/myenv/nltk_data'
    - '/home/ml/neo4nan/myenv/share/nltk_data'
    - '/home/ml/neo4nan/myenv/lib/nltk_data'
    - '/usr/share/nltk_data'
    - '/usr/local/share/nltk_data'
    - '/usr/lib/nltk_data'
    - '/usr/local/lib/nltk_data'
    - '/tmp/llama_index'
    - '/home/ml/llamaindex_cache'
**********************************************************************
One more NLTK resource to download.
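In case anyone else hits this, the resource named in the traceback can be fetched up front; if a later LookupError names another resource, download it the same way:

import nltk

# Fetch the corpus named in the LookupError above.
nltk.download("stopwords")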
So I can find the entities in my Neo4j, and I ran response = query_engine.query("Tell me about Interleaf"), but I am getting this:
WARNING:llama_index.indices.knowledge_graph.retrievers:Index was not constructed with embeddings, skipping embedding usage...
Index was not constructed with embeddings, skipping embedding usage...
INFO:llama_index.indices.knowledge_graph.retrievers:> No relationships found, returning nodes found by keywords.
No relationships found, returning nodes found by keywords.
INFO:llama_index.indices.knowledge_graph.retrievers:> No nodes found by keywords, returning empty response.
No nodes found by keywords, returning empty response.
query_engine is set up like in the example: query_engine = index.as_query_engine(include_text=False, response_mode="tree_summarize")
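A sketch of the embedding-based variant, in case keyword extraction keeps failing with the custom LLM: build the index with embeddings and query in hybrid mode. Argument names follow the 0.8-era KnowledgeGraphIndex docs and reuse the documents/contexts from the earlier snippet; verify against your installed version:

# Build the index so each triplet also gets an embedding.
index = KnowledgeGraphIndex.from_documents(
    documents,
    storage_context=storage_context,
    service_context=service_context,
    max_triplets_per_chunk=2,
    include_embeddings=True,
)

# Retrieve by keywords and by embedding similarity.
query_engine = index.as_query_engine(
    include_text=False,
    response_mode="tree_summarize",
    embedding_mode="hybrid",
    similarity_top_k=5,
)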
It looks like there were no keywords found in your query (and hence, no triplets to be found).
Am I misunderstanding what a keyword means? Interleaf is in the query, and I do have the entity and its triplet in Neo4j.
(Attachment: image.png)
The query is "Tell me about Interleaf"
There is a step that has to extract keywords from the query. It usually prints out the keywords it found
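One way to surface those log lines (standard Python logging, nothing LlamaIndex-specific):

import logging
import sys

# Emit llama_index's INFO-level messages (including the extracted keywords) to stdout.
logging.basicConfig(stream=sys.stdout, level=logging.INFO)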
I am using the same query and document; however, the keyword is found in the LlamaIndex example. Did I miss something?
It's dependent on the LLM -- maybe your LLM was not able to find the keywords? I remember you said you weren't using OpenAI.
It's a vanilla Llama 2; the training is still going on, so I am using the open-source version at the moment. Let me take a look at the keywords.
Looks like I am better off going question -> Cypher now.
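A rough sketch of that fallback, using the official neo4j Python driver and the custom LLM to draft the Cypher; my_llm stands for the CustomLLM instance from earlier, and the prompt wording is my own placeholder, not part of LlamaIndex:

from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def question_to_cypher(question: str) -> str:
    # Ask the custom LLM to translate the question; prompt wording is illustrative.
    prompt = (
        "Translate the question into a single Cypher query over a graph of "
        "(:Entity)-[:RELATION]->(:Entity) triplets. Return only the query.\n"
        f"Question: {question}"
    )
    return my_llm.complete(prompt).text.strip()

def run_question(question: str):
    cypher = question_to_cypher(question)
    with driver.session() as session:
        return [record.data() for record in session.run(cypher)]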
Actually, could you point me to the code where the keyword fetching occurs, when you have time?
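While waiting, one way to test that step in isolation: the KG retriever (llama_index.indices.knowledge_graph.retrievers, per the log above) asks the LLM for keywords with a prompt along these lines. The template text here is an approximation rather than the library's exact prompt, and my_llm again stands for the CustomLLM instance:

# Send a keyword-extraction style prompt straight to the custom LLM to see
# whether it returns anything usable for "Tell me about Interleaf".
keyword_prompt = (
    "A question is provided below. Given the question, extract up to 10 "
    "keywords from the text, separated by commas, in the form "
    "'KEYWORDS: <keywords>'.\n"
    "---------------------\n"
    "Tell me about Interleaf\n"
    "---------------------\n"
)
print(my_llm.complete(keyword_prompt).text)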