I want to know, what data do we end up sending to the LLM?
PhilLiu
10 months ago
I want to know: what data do we end up sending to the LLM? How should I check? Thank you.
WhiteFang_Jr
10 months ago
You can try the LlamaIndex observability feature:
https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html#observability
If you just want to see the LLM input/output, just do:
import llama_index  # at the top of your script

llama_index.set_global_handler("simple")
If you want more, check the Arize integration; I have heard it is good.
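A minimal end-to-end sketch of what the "simple" handler looks like in practice (this assumes the legacy llama_index package, an OpenAI API key in the environment, and a hypothetical data/ folder of documents):

import llama_index  # at the top of your script
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# The "simple" handler prints every prompt sent to the LLM and the raw
# completion to stdout, so you can see exactly what data is sent.
llama_index.set_global_handler("simple")

# data/ is a hypothetical folder of documents
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
response = query_engine.query("What does this document say?")
print(response)

Running the query prints the full prompt, including the retrieved context that gets stuffed into it, before the final answer, which is usually the quickest way to see what data is being sent.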
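If you go the Arize route, the same global-handler mechanism should apply; as a sketch, assuming the arize-phoenix package is installed and your llama_index version supports this handler name:

import llama_index

# Assumes `pip install arize-phoenix`; traces can then be inspected in the Phoenix UI
llama_index.set_global_handler("arize_phoenix")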