Updated 2 months ago
How do I output the prompt before it is sent to the LLM?
frauas
11 months ago
How do I output the prompt before it is sent to the LLM for answer generation? I want to check the format and make sure everything is correct before it goes to the LLM.
7 comments
Logan M
11 months ago
https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html#simple-llm-inputs-outputs
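The linked page covers LlamaIndex's simple LLM input/output observability handler. A minimal sketch of the idea, with the real enabling call shown in a comment (the exact import path may vary by LlamaIndex version) and a stand-in `fake_llm` function used so the wrapper below is self-contained:

```python
# In LlamaIndex this is enabled globally, per the linked docs
# (import path may differ across versions):
#   import llama_index.core
#   llama_index.core.set_global_handler("simple")
#
# The handler's effect can be sketched as a wrapper that prints the
# exact prompt before forwarding it to the model. fake_llm is a
# hypothetical stand-in for the real LLM call.

def fake_llm(prompt: str) -> str:
    # Stand-in for the real LLM call.
    return f"answer to: {prompt}"

def traced_llm(prompt: str) -> str:
    # Print the prompt before it is sent, then forward it unchanged.
    print("** Prompt: **")
    print(prompt)
    return fake_llm(prompt)

response = traced_llm("Context: ...\nQuestion: What is X?")
```

With the real handler enabled, every prompt the query engine builds is printed at call time, which is what lets you inspect the format before generation.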
frauas
11 months ago
I added it, but when I run a query it just outputs the response. Where do I have to set it so that it returns the prompt?
Attachment
Logan M
11 months ago
you might have to add it at the start of your script, and restart the notebook
frauas
11 months ago
It worked, thanks a lot!
Does this handler also output the message list of a chat conversation?
Logan M
11 months ago
It should, assuming the chat() method is being called.
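In chat mode the handler would see the full message list rather than a single prompt string. A hypothetical sketch of what that list looks like and how it could be printed before being sent (the dict shape here is illustrative, not LlamaIndex's actual message type):

```python
# Hypothetical message list for a chat conversation; in LlamaIndex the
# handler sees the full history when chat() is called.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is LlamaIndex?"},
]

def print_messages(messages: list) -> None:
    # Print each role/content pair before the list is sent to the LLM.
    for m in messages:
        print(f"{m['role']}: {m['content']}")

print_messages(messages)
```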
frauas
11 months ago
Do you know why the prompt is not being updated?
Attachment
frauas
11 months ago
Never mind, I had to move the update call a bit further down. It works now. Thank you!
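The ordering pitfall resolved above can be sketched generically: the prompt template must be updated before the query runs, or the old template is still in effect. `Engine` here is a toy stand-in, not the LlamaIndex API (LlamaIndex engines do expose an `update_prompts` method, but its signature differs):

```python
# Toy engine illustrating why call order matters: the template that is
# current at query time is the one that gets used.

class Engine:
    def __init__(self) -> None:
        self.template = "old template: {q}"

    def update_prompts(self, template: str) -> None:
        self.template = template

    def query(self, q: str) -> str:
        return self.template.format(q=q)

engine = Engine()
engine.update_prompts("new template: {q}")  # update first...
print(engine.query("hello"))                # ...then the query uses it
```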