Updated 9 months ago

Observability

Too bad, I am using TypeScript; it seems this method is not supported there.
3 comments
You can try using https://ts.llamaindex.ai/observability/

It will show LLM input/output/prompts and much more!
Thanks! Is there an option to see the logs locally? I tried it, and it does not work locally.
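If the hosted observability integration is not working locally, one fallback is to log LLM inputs and outputs yourself. The sketch below is a minimal, self-contained TypeScript tracer; it is NOT the llamaindex API (the `LocalTracer` and `traced` names are hypothetical), just an illustration of wrapping an LLM call so its prompt and response land in local console output.

```typescript
// Hypothetical local-logging sketch -- not part of llamaindex.
type LLMEvent = { kind: "input" | "output"; payload: string };

class LocalTracer {
  private events: LLMEvent[] = [];

  // Record one event and echo it to the local console.
  record(kind: LLMEvent["kind"], payload: string): void {
    this.events.push({ kind, payload });
    console.log(`[trace] ${kind}: ${payload}`);
  }

  // Return a copy of everything recorded so far.
  dump(): LLMEvent[] {
    return [...this.events];
  }
}

// Wrap any async LLM call so its input and output are logged locally.
async function traced(
  tracer: LocalTracer,
  prompt: string,
  call: (p: string) => Promise<string>,
): Promise<string> {
  tracer.record("input", prompt);
  const out = await call(prompt);
  tracer.record("output", out);
  return out;
}
```

Usage: pass your real LLM call as the `call` argument, e.g. `await traced(tracer, "hello", (p) => llm.complete(p))`, then inspect `tracer.dump()` or the console output.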