Updated 4 weeks ago

I'm getting a ZeroDivisionError when trying to print out LLM context and prompts with the LlamaDebugHandler.

Hi, I am trying to use https://docs.llamaindex.ai/en/stable/examples/observability/LlamaDebugHandler/ to print out LLM context and prompts, but I am getting the following error. This is with code identical to the example, running from Colab.
---------------------------------------------------------------------------
ZeroDivisionError                         Traceback (most recent call last)
<ipython-input-25-4da50f59adab> in <cell line: 2>()
      1 # Print info on the LLM calls during the summary index query
----> 2 print(llama_debug.get_event_time_info(CBEventType.LLM))

/usr/local/lib/python3.10/dist-packages/llama_index/core/callbacks/llama_debug.py in _get_time_stats_from_event_pairs(self, event_pairs)
    130         return EventStats(
    131             total_secs=total_secs,
--> 132             average_secs=total_secs / len(event_pairs),
    133             total_count=len(event_pairs),
    134         )

ZeroDivisionError: float division by zero
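Reading the traceback, the division by zero happens because `event_pairs` is empty: no LLM events were recorded before `get_event_time_info(CBEventType.LLM)` was called (for example, if no query was run, or the debug handler was not attached to the callback manager). A minimal standalone sketch of the failing computation, with a guard for the empty case (this is an illustrative reproduction, not the library's actual code; the `EventStats` fields mirror the ones in the traceback):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class EventStats:
    # Field names follow the EventStats constructor visible in the traceback.
    total_secs: float
    average_secs: float
    total_count: int

def get_time_stats(event_pairs: List[Tuple[float, float]]) -> EventStats:
    """Compute timing stats from (start, end) pairs, guarding the empty case."""
    if not event_pairs:
        # Without this guard, len(event_pairs) == 0 makes
        # total_secs / len(event_pairs) raise
        # "ZeroDivisionError: float division by zero" -- the error above.
        return EventStats(total_secs=0.0, average_secs=0.0, total_count=0)
    total_secs = sum(end - start for start, end in event_pairs)
    return EventStats(
        total_secs=total_secs,
        average_secs=total_secs / len(event_pairs),
        total_count=len(event_pairs),
    )
```

So the practical check is to confirm that a query actually ran with the debug handler attached (so LLM events get recorded) before asking for event time info.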
1 comment
Hey! Are you on the latest version of llama-index?
Also, I would suggest using the Instrumentation module instead: https://docs.llamaindex.ai/en/stable/module_guides/observability/instrumentation/