Hey Queso, llama index actually uses a single prompt to extract triplets across text chunks. It should make more sense once you check out the default prompt here:
https://github.com/jerryjliu/llama_index/blob/main/gpt_index/prompts/default_prompts.py#L241

Also, check out the demo notebook as well:
https://github.com/jerryjliu/llama_index/blob/main/examples/knowledge_graph/KnowledgeGraphDemo.ipynb

At query time, llama index extracts keywords from the query and looks for triplets that have the same subject as each keyword.
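To make that query-time step concrete, here's a toy sketch of the retrieval idea in plain Python (this is not llama index's actual internals — the triplet store and the example keywords are made up for illustration):

```python
# Toy triplet store keyed by subject. In llama index these triplets come from
# the LLM extraction step; the entries here are made up.
triplets_by_subject = {
    "Alice": [("Alice", "works at", "Acme")],
    "Acme": [("Acme", "is located in", "Paris")],
}

def retrieve_triplets(query_keywords):
    """Collect every triplet whose subject matches one of the query keywords."""
    matched = []
    for kw in query_keywords:
        matched.extend(triplets_by_subject.get(kw, []))
    return matched

# E.g. keywords extracted from "Where is Acme located?" might include "Acme":
print(retrieve_triplets(["Acme"]))  # [('Acme', 'is located in', 'Paris')]
```

So the lookup is basically keyword -> matching subjects -> their triplets.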
If include_text is true (the default), llama index includes both the triplet and the text chunk where the triplet was found when we ask the LLM to answer the query.
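Here's a rough sketch of what include_text changes in the context sent to the LLM (again illustrative only — the chunk_for_triplet mapping and the made-up chunk text are assumptions, not llama index's actual code):

```python
# Maps each triplet to the text chunk it was extracted from (made-up data).
chunk_for_triplet = {
    ("Acme", "is located in", "Paris"): "Acme opened its Paris headquarters in 2010.",
}

def build_context(triplets, include_text=True):
    """Render retrieved triplets (and optionally their source chunks) as LLM context."""
    lines = []
    for subj, rel, obj in triplets:
        lines.append(f"({subj}, {rel}, {obj})")
        if include_text:
            # With include_text=True, the chunk the triplet came from rides along.
            lines.append(chunk_for_triplet[(subj, rel, obj)])
    return "\n".join(lines)

print(build_context([("Acme", "is located in", "Paris")]))
```

With include_text=False the LLM only sees the bare triplets, so answers lean entirely on the extracted graph rather than the underlying text.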