Modifying Prompts with Custom Metadata from Source Nodes

@Logan M I am referencing this thread that you replied to from last week: https://discord.com/channels/1059199217496772688/1059200010622873741/1304241311255101451
***
I still want to modify my prompt with custom metadata from each source node. Something like this:
Plain Text
Please answer only with the information given in context:
"""
Metadata: 
{ "book": "xyz", "page": "3" }
This is a chapter on the history of Rome etc
{ "book": "gef", "page": "374" }
This is a chapter on Athens etc ...
"""
Question: {} 

What is the best approach to creating a custom template for this?
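One way to sketch the desired prompt shape in plain Python, before involving any library: render each chunk's metadata as a JSON line above its text, then fill a QA template. Everything here (the template string, `format_chunk`, the sample chunks) is illustrative, not part of the llama_index API.

```python
import json

# Hypothetical template mirroring the prompt shape described above.
QA_TEMPLATE = (
    'Please answer only with the information given in context:\n'
    '"""\n{context_str}\n"""\n'
    'Question: {query_str}\n'
)

def format_chunk(metadata: dict, text: str) -> str:
    """Render one source chunk as a JSON metadata line followed by its text."""
    return f"Metadata:\n{json.dumps(metadata)}\n{text}"

# Plain tuples stand in for real retrieved source nodes.
chunks = [
    ({"book": "xyz", "page": "3"}, "This is a chapter on the history of Rome etc"),
    ({"book": "gef", "page": "374"}, "This is a chapter on Athens etc ..."),
]

context_str = "\n".join(format_chunk(m, t) for m, t in chunks)
prompt = QA_TEMPLATE.format(context_str=context_str, query_str="Who founded Rome?")
print(prompt)
```

The same assembled `context_str` could then be handed to whatever template mechanism the library provides.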
@Logan M Ahhh... I am using a query engine.
... will provide some code soon when I get a chance.
From the response itself, I can get the nodes and source text etc., but how do I get the entire context/prompt that is used to generate the response (with all the values filled in from the prompt templates)?
Do you want it programmatically or just print to console?

Printing is easy; just put this at the top of your code:

Plain Text
from llama_index.core import set_global_handler 
set_global_handler("simple")
@Logan M
a) you are always super helpful - thank you πŸ™‚
b) how do i do this programmatically?
It's a little more annoying programmatically. A few options:

  • take the source nodes and format them into the prompt yourself
Plain Text
from llama_index.core.prompts.default_prompt_selectors import DEFAULT_TEXT_QA_PROMPT_SEL

# get source nodes from the retriever
nodes = retriever.retrieve(query_str)
# or reuse the nodes attached to an existing response
nodes = response.source_nodes

context_str = "\n\n".join([n.node.get_content(metadata_mode="llm") for n in nodes])
prompt = DEFAULT_TEXT_QA_PROMPT_SEL.format(llm=llm, context_str=context_str, query_str=query_str)
print(prompt)

  • you could also do the same as above and then call the LLM directly on that prompt, removing the need for the query engine (NOTE: this removes the handling for when the retrieved nodes exceed the LLM's context window)
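To illustrate the caveat in that note: calling the LLM directly means you must cap the context yourself. A minimal sketch of one naive approach, dropping whole chunks once a budget is spent; `MAX_CONTEXT_CHARS` and `fit_context` are hypothetical names, and a real implementation would count tokens, not characters.

```python
MAX_CONTEXT_CHARS = 200  # illustrative budget; real code would use a token limit

def fit_context(chunks: list[str], budget: int = MAX_CONTEXT_CHARS) -> str:
    """Keep whole chunks, in retrieval order, until the budget is exhausted."""
    kept, used = [], 0
    for chunk in chunks:
        cost = len(chunk) + 2  # +2 accounts for the "\n\n" joiner
        if used + cost > budget:
            break  # drop this chunk and everything after it
        kept.append(chunk)
        used += cost
    return "\n\n".join(kept)

# Three 90-character chunks: only the first two fit within the 200-char budget.
chunks = ["A" * 90, "B" * 90, "C" * 90]
context_str = fit_context(chunks)
print(len(context_str))
```

A query engine's response synthesizer does this kind of trimming (and refine/compact strategies) for you, which is the handling you give up by prompting the LLM directly.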