What could be causing `trace_map` in `end_trace` to be None?

What could be causing trace_map in end_trace to be None? In attempting to add arize-phoenix to privateGPT, I'm noticing significantly different behavior than in the toy project I created to test Phoenix. For some reason the traces never get emitted in privateGPT, and it seems to be because OpenInferenceTraceCallbackHandler.end_trace() is not called with a trace_map.

I'm struggling to find where end_trace is called from as well
Since most traces are started with a decorator, end_trace is usually called in the context manager here

https://github.com/run-llama/llama_index/blob/d02d59d934df19e66fa91cc6baea8550b370595c/llama_index/callbacks/base.py#L196
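The decorator-plus-context-manager pattern being described can be sketched roughly like this. This is a minimal illustration with hypothetical names (MiniCallbackManager, RecordingHandler), not llama_index's actual code: the key point is that the context manager's exit path is what ends up calling end_trace with the accumulated trace_map.

```python
from collections import defaultdict
from contextlib import contextmanager

# Hypothetical sketch of the pattern: the trace decorator enters a
# context manager, and the context manager's exit path is the place
# that calls end_trace with whatever trace_map accumulated.
class MiniCallbackManager:
    def __init__(self, handlers):
        self.handlers = handlers
        self._trace_map = defaultdict(list)

    @contextmanager
    def as_trace(self, trace_id):
        self.start_trace(trace_id)
        try:
            yield
        finally:
            # end_trace always fires on exit, even if the body raised
            self.end_trace(trace_id, trace_map=self._trace_map)

    def start_trace(self, trace_id):
        self._trace_map = defaultdict(list)
        for h in self.handlers:
            h.start_trace(trace_id)

    def end_trace(self, trace_id, trace_map=None):
        for h in self.handlers:
            h.end_trace(trace_id, trace_map=trace_map)


class RecordingHandler:
    def __init__(self):
        self.calls = []

    def start_trace(self, trace_id):
        self.calls.append(("start", trace_id))

    def end_trace(self, trace_id, trace_map=None):
        self.calls.append(("end", trace_id, dict(trace_map)))


handler = RecordingHandler()
manager = MiniCallbackManager([handler])
with manager.as_trace("chat"):
    # stand-in for what on_event_start would record
    manager._trace_map["root"].append("event-1")
```

In this sketch the handler is guaranteed to see end_trace with a populated trace_map, which is the behavior the question says is missing in privateGPT.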
What version of llama-index are you using? There was an older issue related to this where if a trace wasn't started before an event got logged, some issues might happen.

Now, if an event is logged before a trace gets logged, it starts a default trace
https://github.com/run-llama/llama_index/blob/d02d59d934df19e66fa91cc6baea8550b370595c/llama_index/callbacks/base.py#L89
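A rough sketch of that default-trace fallback (hypothetical names, not the actual llama_index code): an event arriving before any trace has started opens an implicit trace on the fly, but nothing arranges for it to be closed.

```python
# Hypothetical sketch of the fallback logic: if an event is logged
# before any trace has been started, on_event_start opens a default
# trace on the fly -- but no matching end_trace is ever scheduled.
class MiniManager:
    def __init__(self):
        self.trace_open = False
        self.end_trace_calls = 0

    def start_trace(self, trace_id="default"):
        self.trace_open = True

    def end_trace(self, trace_id="default"):
        self.end_trace_calls += 1
        self.trace_open = False

    def on_event_start(self, event_id):
        if not self.trace_open:
            self.start_trace()  # implicit default trace opened here...
        # ...but nothing here is responsible for calling end_trace later
        return event_id


m = MiniManager()
m.on_event_start("event-1")
```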

Although now I'm maybe seeing an issue with that logic, because starting a trace that way, where does end_trace get called πŸ˜…
going to make a PR to fix that...
Ah, I see what you mean. Not sure that's the issue though, because end_trace is in fact getting called in both projects.

I added some print statements:

Plain Text
    @graceful_fallback(_null_fallback)
    def end_trace(
        self,
        trace_id: Optional[str] = None,
        trace_map: Optional[ChildEventIds] = None,
    ) -> None:
        if not trace_map:
            print("trace_map is None")
            return  # TODO: investigate when empty or None trace_map is passed
        print("adding spans")
        _add_spans_to_tracer(
            event_id_to_event_data=self._event_id_to_event_data,
            trace_map=trace_map,
            tracer=self._tracer,
        )
        self._event_id_to_event_data = defaultdict(lambda: CBEventData())


trace_map is always None in the privateGPT project and not None in the toy tester project, built in the same conda env
hmm, is it maybe because something in privateGPT isn't propagating the callback manager to all components somehow?
not 100% sure what the issue could be.... I haven't used privateGPT at all before
I'm assuming it is an issue within privateGPT, but am wondering if you can point me in the right direction so I know what kinds of things could cause the callback manager to not be exposed to the components... though it seems like it is exposed, because on_event_start and on_event_end are called multiple times
Also, when I use

Plain Text
llama_index.set_global_handler("simple")

Everything works correctly (emitting events to the terminal and such), which seems to indicate that the callback manager is available to the components
Yea that's true..

It really depends on how llama-index is being used in privateGPT. Is it using just the query engine and chat engine entry points? Is it directly calling the llm object?
Among other things it's using the SimpleChatEngine.stream_chat() function, which is all I'm trying to get observability on for the moment
That's interesting, the stream_chat() method there has the trace decorator, so tracing should be happening properly πŸ€”
and again, it is in fact calling the handler's methods like on_event_start and on_event_end. The issue is that once end_trace gets called, the trace_map is None, so it doesn't emit anything
Ah I see, I see. I think I see why
I've got print statements in pretty much every function now, and everything seems to be happening identically in my two projects except when it gets to end_trace
SimpleChatEngine only calls the llm, but the llm is considered a leaf event, so it doesn't modify the trace stack
I can probably confirm that in a bit
But start_event and end_event have checks for leaf events
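A rough sketch of what a leaf-event check like that could look like. This is illustrative only; the LEAF_EVENTS set and the function shape here are assumptions, not llama_index's actual code:

```python
# Illustrative sketch: leaf events are recorded in the trace map but are
# never pushed onto the trace stack, so they cannot become parents and
# they leave the current nesting untouched.
LEAF_EVENTS = {"llm"}  # assumption for illustration

def on_event_start(event_type, trace_stack, trace_map):
    event_id = f"{event_type}-event"
    parent = trace_stack[-1] if trace_stack else "root"
    trace_map.setdefault(parent, []).append(event_id)
    if event_type not in LEAF_EVENTS:
        trace_stack.append(event_id)  # only non-leaf events nest
    return event_id


stack, tmap = ["root"], {}
on_event_start("llm", stack, tmap)    # leaf: stack unchanged
on_event_start("query", stack, tmap)  # non-leaf: pushed onto the stack
```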
Okay, so I added some logging and I am now very confused. It looks like the events are being added to the trace map, but when end_trace gets called, somehow self._trace_map is empty

Plain Text
adding to trace map in base: dd3e7ed6-a2c8-42aa-bd8d-fa1ae636d701 -> root
trace map is now {"root": ["dd3e7ed6-a2c8-42aa-bd8d-fa1ae636d701"]}
 in on_event_start 
adding to trace map in base: fc6d5d59-09df-401d-bc33-728d8fed509d -> root
trace map is now {"root": ["dd3e7ed6-a2c8-42aa-bd8d-fa1ae636d701", "fc6d5d59-09df-401d-bc33-728d8fed509d"]}
 in on_event_start 
stream_chat was called
adding to trace map in base: 1adb7661-bbd7-4b21-aba2-4aea9b05b3d9 -> root
trace map is now {"root": ["dd3e7ed6-a2c8-42aa-bd8d-fa1ae636d701", "fc6d5d59-09df-401d-bc33-728d8fed509d", "1adb7661-bbd7-4b21-aba2-4aea9b05b3d9"]}
ending trace: chat, trace_map: defaultdict(<class 'list'>, {}), handler: <phoenix.trace.llama_index.callback.OpenInferenceTraceCallbackHandler object at 0x2a3a6db90>
 in on_event_start 
json stringified trace_map: {}
trace_map is None
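One hypothetical way to reproduce exactly that symptom (events visibly added, yet end_trace receives an empty map) is if two different manager instances are in play. A minimal sketch of that failure mode, not privateGPT's actual code:

```python
from collections import defaultdict

# Hypothetical sketch: if the component logging events holds a different
# manager instance than the one whose trace decorator fires end_trace,
# events accumulate in one trace_map while end_trace reads another,
# empty one -- matching the log output above.
class Manager:
    def __init__(self):
        self.trace_map = defaultdict(list)

    def on_event_start(self, event_id):
        self.trace_map["root"].append(event_id)

    def end_trace(self):
        return dict(self.trace_map)


tracing_mgr = Manager()    # instance whose trace decorator fires end_trace
component_mgr = Manager()  # instance the component actually logs events to

component_mgr.on_event_start("event-1")  # events land in one map...
result = tracing_mgr.end_trace()         # ...while end_trace reads the other
```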
This is just for SimpleChatEngine?
Or whats the easiest way to reproduce this?