Updated 2 months ago

Implementing Observability Functions in LiteralAI

Has anyone had luck implementing the observability funcs in https://docs.llamaindex.ai/en/stable/module_guides/observability ? I'm trying to get LiteralAI working and I see some very basic logs in my console but nothing on the cloud instance. Are there jupyter notebooks somewhere that I can be pointed to for a bit of context?
9 comments
I tried setting it up with set_global_handler with no luck. literalai_callback_handler doesn't return a handler object, and create_global_handler ends up assigning that None to core.global_handler
The docs make it seem like the callback/global handlers are being deprecated, so I'm not sure what the move is here.
callbacks are in limbo as things migrate to instrumentation

set_global_handler("literalai") sets up a bunch of instrumentation, and does not use the older callback system
At first glance, the source code seems fine to me
For the instrumentation, should I be setting up steps individually? I was under the impression that it would integrate with my other llama_index stuff seamlessly and that doesn't seem to be the case
I honestly think their service/integration might be broken

Even following their official cookbook, nothing gets logged for me
https://github.com/Chainlit/literalai-cookbooks/blob/main/python/llamaindex-integration/RAG%20LlamaIndex.ipynb
I'd say the most used observability integrations are arize and langfuse, fyi
I'll look into those. Thanks!
following up as well -- literalai was working for me, but I needed to make an actual query first lol