How to change the kg_triple_extract_template in KnowledgeGraphIndex

Please stand by @RUPP as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
The kg_triple_extract_template is a parameter you can set when instantiating the KnowledgeGraphIndex class in LlamaIndex. It specifies the prompt used to extract triplets for the knowledge graph.

Here is an example of how you can set this parameter:

Plain Text
from llama_index.indices.knowledge_graph import KnowledgeGraphIndex
from llama_index.prompts.base import PromptTemplate

# Define your custom prompt template. BasePromptTemplate is abstract,
# so use a concrete subclass such as PromptTemplate.
custom_template = PromptTemplate(...)

# Pass your custom template when building the index. Note that an index
# also needs data, e.g. nodes=... or KnowledgeGraphIndex.from_documents(...).
kg_index = KnowledgeGraphIndex(kg_triple_extract_template=custom_template)


Please replace PromptTemplate(...) with your actual prompt template.

Remember that the kg_triple_extract_template parameter is optional. If you don't provide it, the KnowledgeGraphIndex will use a default template.

For more details, you can refer to the LlamaIndex documentation.
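To make the idea concrete, here is a stand-alone sketch of what such a triplet-extraction template amounts to. This is not LlamaIndex's actual PromptTemplate class; the prompt wording and the {text}/{max_knowledge_triplets} placeholder names mirror the library's default triplet-extraction template but are used here purely for illustration.

```python
# Stand-alone sketch of the prompt-template idea (not LlamaIndex's actual
# PromptTemplate class): a triplet-extraction prompt with placeholders
# that get filled in for each chunk of the source document.
TRIPLE_EXTRACT_PROMPT = (
    "Extract up to {max_knowledge_triplets} (subject, predicate, object) "
    "triplets from the text below.\n"
    "---------------------\n"
    "{text}\n"
    "---------------------\n"
    "Triplets:"
)

def format_triple_prompt(text: str, max_knowledge_triplets: int = 10) -> str:
    """Fill the template's placeholders, as a template's format() would."""
    return TRIPLE_EXTRACT_PROMPT.format(
        max_knowledge_triplets=max_knowledge_triplets, text=text
    )

prompt = format_triple_prompt("Paris is the capital of France.", 5)
```

LlamaIndex's real template objects add partial formatting, metadata, and chat-message support on top of this basic fill-in-the-placeholders behavior.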
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai show an example
Let me check the knowledge base for you. One moment! :books:
----------------------
I'm sorry for the confusion, but the provided extracts do not include a specific example of changing the kg_triple_extract_template in KnowledgeGraphIndex. However, I can guide you on how you might do it based on the information available.

Here's a general example:

Plain Text
from llama_index.indices.knowledge_graph import KnowledgeGraphIndex
from llama_index.prompts.base import PromptTemplate

# Define your custom prompt template; PromptTemplate is a concrete
# subclass of the abstract BasePromptTemplate
custom_template = PromptTemplate("Your custom prompt here")

# Instantiate the KnowledgeGraphIndex with your custom template
kg_index = KnowledgeGraphIndex(kg_triple_extract_template=custom_template)


In this example, replace "Your custom prompt here" with the actual prompt you want to use for extracting triplets (it will typically include a {text} placeholder for the chunk being processed).

Please note that this is a hypothetical example and might not work as-is, since the exact prompt-template API is not shown in the extracts. For the exact usage, you should refer to the LlamaIndex documentation.
----------------------
Let me check the knowledge base for you. One moment! :books:
Sorry @RUPP, I seem to be having a bit of trouble. Don't worry though, I've pinged my support team and they will try to fix me up! :technologist:

Perhaps try again in 15 mins?
@kapa.ai Can't instantiate abstract class BasePromptTemplate with abstract methods format, format_messages, get_template, partial_format
Let me check the knowledge base for you. One moment! :books:
----------------------
The error occurs because BasePromptTemplate is an abstract base class in the LlamaIndex library: it declares methods that subclasses must override, and it cannot be instantiated directly. The methods format, format_messages, get_template, and partial_format are abstract, so any subclass you create must provide implementations for all four.

Here's a general structure of how you might create a subclass of BasePromptTemplate:

Plain Text
from llama_index.prompts.base import BasePromptTemplate

class MyPromptTemplate(BasePromptTemplate):
    def format(self, llm=None, **kwargs):
        # Your implementation here
        pass

    def format_messages(self, llm=None, **kwargs):
        # Your implementation here
        pass

    def get_template(self, llm=None):
        # Your implementation here
        pass

    def partial_format(self, **kwargs):
        # Your implementation here
        pass


In this example, replace the # Your implementation here comments with the actual code for each method.

Please note that this is a hypothetical example and might not work as-is because the actual implementation of these methods is not provided in the extracts. For the exact usage, you should refer to the LlamaIndex documentation.
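The abstract-method mechanics behind this error can be reproduced with nothing but Python's standard abc module. The classes below are a stand-in sketch, not LlamaIndex's actual hierarchy: the base class declares an abstract interface, instantiating it raises exactly this kind of TypeError, and only a subclass implementing every abstract method can be constructed.

```python
from abc import ABC, abstractmethod

# Stand-in sketch of the abstract-base-class pattern behind the error
# (not LlamaIndex's actual classes).
class BaseTemplate(ABC):
    @abstractmethod
    def format(self, **kwargs) -> str:
        ...

class StringTemplate(BaseTemplate):
    """Concrete subclass: implements every abstract method, so it can
    be instantiated."""

    def __init__(self, template: str):
        self.template = template

    def format(self, **kwargs) -> str:
        return self.template.format(**kwargs)

# BaseTemplate()  # would raise: TypeError: Can't instantiate abstract class
t = StringTemplate("Extract triplets from: {text}")
```

This is why the library ships concrete subclasses for users to instantiate, while the base class only defines the required interface.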
----------------------
@kapa.ai One of nodes or index_struct must be provided.
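This last error usually means the index constructor was called directly without any data: KnowledgeGraphIndex is normally built from documents or nodes (for example via KnowledgeGraphIndex.from_documents(...)) rather than instantiated bare. Below is a stand-alone sketch of the kind of either-or argument check that produces such a message (not LlamaIndex's actual code):

```python
# Stand-alone sketch of a "one of two arguments required" constructor
# check, the pattern behind this error (not LlamaIndex's implementation).
class Index:
    def __init__(self, nodes=None, index_struct=None):
        if nodes is None and index_struct is None:
            raise ValueError("One of nodes or index_struct must be provided.")
        self.nodes = nodes
        self.index_struct = index_struct

# Index()                      # raises ValueError
idx = Index(nodes=["node-1"])  # passing data satisfies the check
```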