Updated 3 months ago

I want to know what data we end up sending to the LLM. How should I check? Thank you!
1 comment
You can try LlamaIndex observability feature: https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html#observability

If you just want to see the LLM input/output, do:
Plain Text
import llama_index

# put this at the top of your script, before running any queries
llama_index.set_global_handler("simple")


If you want more, check out the Arize integration; I have heard it is good.
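Conceptually, the `simple` handler just intercepts each LLM call and prints the exact prompt sent and the completion received. Here is a stdlib-only sketch of that idea (this is NOT LlamaIndex code; `SimpleHandler`, `FakeLLM`, and their methods are made-up names for illustration):

Plain Text
# Hypothetical sketch of what a "simple" observability handler does:
# log the raw input/output of every LLM call.

class SimpleHandler:
    def on_llm(self, prompt: str, response: str) -> None:
        print("** Prompt: **")
        print(prompt)
        print("** Completion: **")
        print(response)

class FakeLLM:
    """Stand-in LLM so the sketch runs without any API key."""
    def __init__(self, handler: SimpleHandler):
        self.handler = handler

    def complete(self, prompt: str) -> str:
        response = f"echo: {prompt}"  # a real model call would go here
        self.handler.on_llm(prompt, response)
        return response

llm = FakeLLM(SimpleHandler())
result = llm.complete("Context: ...\nQuestion: what data is sent?")

With the real global handler set, you would see similar prompt/completion pairs printed for every query, including the retrieved context that gets stuffed into the prompt.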