Updated 11 months ago

Hey, guys! Is it possible to log the streaming message via CallbackManager? I'd like to get a trace with the full answer from the LLM.
6 comments
You cannot get streaming responses from the callback manager. It will only log once the stream is exhausted.

Check out some of our observability integrations here (I personally like Arize and OpenLLMetry): https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html#arize-phoenix
If the callback manager can log once the stream is exhausted, maybe it's possible to use the same trace_id?
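The "logs only once the stream is exhausted" behavior can be sketched in plain Python, without llama_index installed (the function name here is hypothetical): tokens are yielded as they arrive, and the end-of-stream callback only fires with the full answer after the generator is consumed.

```python
from typing import Callable, Iterator, List


def stream_with_end_callback(
    token_gen: Iterator[str],
    on_end: Callable[[str], None],
) -> Iterator[str]:
    """Yield tokens as they arrive; invoke on_end with the full text
    only after the stream is exhausted (mirroring why a callback
    manager can only log the complete answer at stream end)."""
    parts: List[str] = []
    for token in token_gen:
        parts.append(token)
        yield token
    on_end("".join(parts))


# Usage: nothing is logged until the stream is fully consumed.
logged: List[str] = []
stream = stream_with_end_callback(iter(["Hello", ", ", "world"]), logged.append)
assert logged == []            # nothing logged yet
tokens = list(stream)          # exhaust the stream
assert logged == ["Hello, world"]
```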
I'm looking at the code, and I believe it's possible to handle the response the same way it's implemented for history:
Plain Text
# override the generator to include writing to chat history
self._memory.put(ChatMessage(role=MessageRole.USER, content=message))
response = StreamingAgentChatResponse(
    chat_stream=response_gen_from_query_engine(query_response.response_gen),
    sources=[tool_output],
)
thread = Thread(
    target=response.write_response_to_history, args=(self._memory, True)
)
thread.start()
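The snippet above spawns a background thread that drains the stream and writes the finished answer into chat history; the same pattern could drain the stream into a logger. A minimal standalone sketch of that pattern, assuming a simplified stand-in class (this is not the real llama_index StreamingAgentChatResponse):

```python
import queue
import threading
from typing import Iterator, List, Optional


class StreamingResponse:
    """Hypothetical stand-in for StreamingAgentChatResponse: a
    background thread drains the token stream, mirrors each token to
    a queue for the caller, and records the full answer in a
    chat-history list once the stream is exhausted."""

    def __init__(self, chat_stream: Iterator[str]) -> None:
        self._chat_stream = chat_stream
        self._queue: "queue.Queue[Optional[str]]" = queue.Queue()
        self.final_text: Optional[str] = None

    def write_response_to_history(self, memory: List[str]) -> None:
        parts: List[str] = []
        for token in self._chat_stream:
            parts.append(token)
            self._queue.put(token)  # mirror the token to the consumer
        self._queue.put(None)       # sentinel: stream exhausted
        self.final_text = "".join(parts)
        memory.append(self.final_text)  # full answer is available here

    def response_gen(self) -> Iterator[str]:
        while (token := self._queue.get()) is not None:
            yield token


# Usage: the caller streams tokens while the thread records history.
memory: List[str] = []
resp = StreamingResponse(iter(["The ", "answer"]))
thread = threading.Thread(target=resp.write_response_to_history, args=(memory,))
thread.start()
tokens = list(resp.response_gen())
thread.join()
assert tokens == ["The ", "answer"] and memory == ["The answer"]
```

The point where `memory.append` runs is exactly where a trace handler would see the complete answer, which is why logging has to wait for stream exhaustion.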
I see that Arize then shows the response in their trace. However, I don't understand how they catch it :) I'm trying to implement my own handler with LlamaDebugHandler.
They should be catching it when the LLM event hits on_event_end 👀
Wow, thanks a lot! I've just implemented my own LlamaDebugHandler with custom on_event_end and it works like a charm πŸ™‚
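The handler pattern from this thread can be sketched without llama_index installed. The hook name on_event_end matches the real callback-handler interface, but the class name, the `"llm"` event-type string, and the `{"response": str}` payload shape below are simplified assumptions for illustration:

```python
from typing import Any, Dict, List, Optional


class FullAnswerLogger:
    """Sketch of a custom debug handler: collect the full LLM answer
    when the LLM event ends (by then the stream has been consumed,
    so the complete response is available)."""

    def __init__(self) -> None:
        self.answers: List[str] = []

    def on_event_end(
        self,
        event_type: str,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
    ) -> None:
        # Assumed payload shape: a plain {"response": str} dict for
        # LLM end events; other event types are ignored.
        if event_type == "llm" and payload:
            self.answers.append(payload["response"])


# Usage: the handler only records completed LLM responses.
handler = FullAnswerLogger()
handler.on_event_end("llm", {"response": "Hello, world"}, event_id="abc")
assert handler.answers == ["Hello, world"]
```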