Implementing LlamaIndex Observability with LiteralAI

At a glance

A community member is trying to implement the observability functions from the LlamaIndex documentation, specifically the LiteralAI integration: they see some basic logs in the console but nothing on the cloud instance. Replies cover using set_global_handler and the literalai_callback_handler, without success, and note that the docs suggest the callback/global handlers are being deprecated in favor of instrumentation, so the recommended approach is unclear. Other observability integrations such as Arize and Langfuse are suggested as alternatives, and one community member notes that LiteralAI only started logging for them after they made an actual query. There is no explicitly marked answer in the comments.

Has anyone had luck implementing the observability funcs in https://docs.llamaindex.ai/en/stable/module_guides/observability ? I'm trying to get LiteralAI working and I see some very basic logs in my console but nothing on the cloud instance. Are there jupyter notebooks somewhere that I can be pointed to for a bit of context?
9 comments
I tried setting it up with set_global_handler with no luck. The literalai_callback_handler doesn't return an object and the create_global_handler assigns the handler (which is None) to the core.global_handler object
The docs make it seem like the callback/global handlers are being deprecated, so I'm not sure what the move is here.
callbacks are in limbo as things migrate to instrumentation

set_global_handler("literalai") sets up a bunch of instrumentation, and does not use the older callback system
At first glance, the source code seems fine to me
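For reference, a minimal sketch of that instrumentation-based setup, assuming the literalai and llama-index packages are installed and that LITERAL_API_KEY is the environment variable the Literal AI client reads (confirm the exact name against the literalai docs):

```python
# Minimal sketch of the global-handler setup described above.
# Assumptions: literalai + llama-index installed; LITERAL_API_KEY is
# the env var the Literal AI client expects (verify in their docs).
import os

from llama_index.core import set_global_handler

os.environ["LITERAL_API_KEY"] = "lsk-..."  # placeholder key

# Registers LiteralAI via the newer instrumentation layer,
# not the older callback system.
set_global_handler("literalai")
```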
For the instrumentation, should I be setting up steps individually? I was under the impression that it would integrate with my other llama_index stuff seamlessly and that doesn't seem to be the case
I honestly think their service/integration might be broken

Following their actual guide, it doesn't log anything for me
https://github.com/Chainlit/literalai-cookbooks/blob/main/python/llamaindex-integration/RAG%20LlamaIndex.ipynb
I'd say the most used observability integrations are arize and langfuse, fyi
I'll look into those. Thanks!
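For comparison, a hedged sketch of the Langfuse route using the same global-handler pattern; the package name and LANGFUSE_* environment variables are assumptions to verify against the LlamaIndex and Langfuse docs:

```python
# Sketch of the Langfuse alternative mentioned above. Assumes the
# llama-index-callbacks-langfuse integration package is installed and
# that the handler reads these env vars -- check the current docs.
import os

from llama_index.core import set_global_handler

os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-..."  # placeholder
os.environ["LANGFUSE_SECRET_KEY"] = "sk-..."  # placeholder
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"

set_global_handler("langfuse")
```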
following up as well -- literalai was working for me, but I needed to make an actual query first lol
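To illustrate that last point, a sketch where traces only get sent once a real query executes; the "./data" folder and the question string are illustrative placeholders:

```python
# Sketch: nothing shows up in LiteralAI until an actual query runs.
# "./data" and the question string are placeholders.
from llama_index.core import (
    SimpleDirectoryReader,
    VectorStoreIndex,
    set_global_handler,
)

set_global_handler("literalai")

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# This call produces the instrumented spans that get logged to the cloud instance.
response = index.as_query_engine().query("What does this document cover?")
print(response)
```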