Updated 9 months ago
Observability
shakedbuk
9 months ago
Too bad, I am using TS; it seems like this method is not supported there.
WhiteFang_Jr
9 months ago
You can try using https://ts.llamaindex.ai/observability/. It will show LLM inputs, outputs, prompts, and much more!
shakedbuk
9 months ago
Thanks, dude. Is there an option to see the logs locally?
shakedbuk
9 months ago
I tried, and it does not work locally.
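Editor's note: the thread leaves local log viewing unresolved. One generic workaround, independent of any observability backend, is to wrap the LLM call yourself and print inputs and outputs to the console. The sketch below does not use LlamaIndex.TS's actual API; `LlmFn`, `withLocalLogging`, and `fakeLlm` are hypothetical names for illustration only.

```typescript
// Generic local-logging pattern. NOT a LlamaIndex.TS API: every name
// here is a stand-in chosen for this sketch.
type LlmFn = (prompt: string) => Promise<string>;

// Wrap any async LLM call so each prompt and completion is printed
// locally, along with the round-trip latency.
function withLocalLogging(llm: LlmFn): LlmFn {
  return async (prompt: string): Promise<string> => {
    console.log(`[llm] prompt: ${prompt}`);
    const start = Date.now();
    const output = await llm(prompt);
    console.log(`[llm] output after ${Date.now() - start} ms: ${output}`);
    return output;
  };
}

// Stand-in model for demonstration; swap in a real LLM call here.
const fakeLlm: LlmFn = async (prompt) => `echo: ${prompt}`;

const loggedLlm = withLocalLogging(fakeLlm);
loggedLlm("hello").then((answer) => console.log(answer));
```

The wrapper returns the model's output unchanged, so it can be dropped in front of any `(string) => Promise<string>` call site without altering behavior.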