Updated 3 months ago

Hello! Might anyone know how I'd be able to print to the console the entire body of text that LlamaIndex is sending to the LLM, including the system prompt, the context, and the query itself?

I've been scouring the documentation and code and can't seem to locate it. (Though the full prompt does show up as part of the output when I include the debugging and tracing code.)
Thank you so much!
5 comments
from llama_index.core import set_global_handler

set_global_handler("simple")
put that at the top of your code
Woah awesome, thank you!!
I initially had that and the other debug code so it was showing everything haha

Do you know if there's a way to store this output, minus the response, in a variable? (So just the system prompt, context, and query.)
Saving it to a variable is a bit more manual. You'll need to write an instrumentation hook to grab it.

Thankfully, we have a guide on this
https://docs.llamaindex.ai/en/stable/examples/instrumentation/observe_api_calls/
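The guide's approach boils down to registering an event handler on a dispatcher: the handler fires on every LLM call and can stash the outgoing payload in a variable instead of printing it. The real hook points (the dispatcher, the event-handler base class, and the LLM event types) are in the linked guide; the sketch below uses plain-Python stand-ins (`Dispatcher` and `PromptCapture` are hypothetical names, not LlamaIndex APIs) just to show the capture pattern:

```python
# Sketch of the capture pattern the instrumentation guide implements.
# Dispatcher / PromptCapture are hypothetical stand-ins, NOT LlamaIndex APIs;
# with LlamaIndex you'd subclass its event-handler base class instead.

class Dispatcher:
    """Minimal event bus: registered handlers see every emitted event."""
    def __init__(self):
        self._handlers = []

    def add_event_handler(self, handler):
        self._handlers.append(handler)

    def emit(self, event: dict):
        for handler in self._handlers:
            handler.handle(event)


class PromptCapture:
    """Stores each outgoing prompt in a variable instead of printing it."""
    def __init__(self):
        self.prompts = []

    def handle(self, event: dict):
        if event.get("type") == "llm_start":
            # Keep the outgoing text (system prompt + context + query);
            # ignore the response event entirely.
            self.prompts.append(event["payload"])


dispatcher = Dispatcher()
capture = PromptCapture()
dispatcher.add_event_handler(capture)

# A framework would emit events like these around each LLM call:
dispatcher.emit({"type": "llm_start",
                 "payload": "system prompt + context + query"})
dispatcher.emit({"type": "llm_end", "payload": "model response"})

print(capture.prompts)  # only the outgoing prompt was captured
```

The same shape applies with the real API: subscribe a handler, filter on the "LLM call started" event, and append its message payload to a list you own.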
Sweet, I'll read up on this. Thank you Logan!