Handling exceptions in workflow with asyncio

Hello, what is the proper way to handle exceptions in a Workflow? In the example generator function below, even though the exception gets caught, the workflow task's exception still seems to bubble up to asyncio's default exception handler. Is this expected behavior?

Plain Text
async def event_generator():
    try:
        wf = MyWorkflow(timeout=30, verbose=True)
        handler = wf.run(user_query=topic["query"])

        async for ev in handler.stream_events():
            yield {"event": "progress", "data": ev.msg}

        final_result = await handler

        # Send final result message
        yield {"event": "workflow_complete", "data": final_result}

    except Exception as e:
        error_message = f"Error in workflow: {str(e)}"
        logger.error(error_message)
        yield {"event": "error", "data": error_message}
12 comments
Plain Text
import asyncio
from llama_index.core.workflow import Workflow, StartEvent, StopEvent, step


class DummyWorkflow(Workflow):
    @step
    def error_step(self, ev: StartEvent) -> StopEvent:
        raise ValueError("I raised")


async def main():
    try:
        workflow = DummyWorkflow()
        handler = workflow.run()

        async for ev in handler.stream_events():
            pass

        final_result = await handler
    except Exception as e:
        print(e)


if __name__ == "__main__":
    asyncio.run(main())
I ran that code, and I get
ValueError: I raised
The stack trace looked like it could be more helpful, though.
So it seems to be an issue specific to the version of llama-index-core. Using llama-index-core 0.11.10 doesn't trigger the error, but 0.11.17 does.

Plain Text
I raised
Exception in callback Dispatcher.span.<locals>.wrapper.<locals>.handle_future_result(span_id='Workflow.run...-c8c97f8f7000', bound_args=<BoundArguments ()>, instance=<__main__.Dum...t 0x1169c00b0>, context=<_contextvars...t 0x116840240>)(<WorkflowHand...r('I raised')>) at pypoetry/virtualenvs/example-qJj8z0Hy-py3.12/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py:273
handle: <Handle Dispatcher.span.<locals>.wrapper.<locals>.handle_future_result(span_id='Workflow.run...-c8c97f8f7000', bound_args=<BoundArguments ()>, instance=<__main__.Dum...t 0x1169c00b0>, context=<_contextvars...t 0x116840240>)(<WorkflowHand...r('I raised')>) at pypoetry/virtualenvs/example-qJj8z0Hy-py3.12/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py:273>
Traceback (most recent call last):
  File "/usr/local/Cellar/python@3.12/3.12.6/Frameworks/Python.framework/Versions/3.12/lib/python3.12/asyncio/events.py", line 88, in _run
    self._context.run(self._callback, *self._args)
  File "pypoetry/virtualenvs/example-qJj8z0Hy-py3.12/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 281, in handle_future_result
    result = future.result()
             ^^^^^^^^^^^^^^^
ValueError: I raised
Yeah, I was using the latest version when I tested just now.

Makes sense. Probably some fixes to error handling were made in the meantime.
Oh got it, which version doesn't have the issue? I tried 0.11.18 and I still see the same issue.
Wait, what is the issue? 😅 The latest version produces the output above, which I think is what you want/expected?
(Also reproduced in Google Colab with 0.11.18, which works the way I would expect)
https://colab.research.google.com/drive/1NNdFlWvkmUO4fbHqBVXELM608LqM3-OW?usp=sharing
Sorry for the confusion 😅 Since the workflow is wrapped in a try/except, I was only expecting the exception message I raised to be printed, instead of it bubbling up to asyncio's default exception handler as an unhandled exception after printing the error message.
ahhh ok I see what you mean. Yeah, ok, I see the issue, it will require a patch to core (some changes were made to how instrumentation works for logging events/spans from workflows; it seems `result = future.result()` shouldn't be called without first checking whether the future has an error).
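For anyone reading this later, here is a rough sketch of the kind of guard being described. It's only an assumption about the shape of the fix, not the actual llama-index-core patch; the callback name `handle_future_result` comes from the traceback above, everything else is illustrative:

Plain Text
import asyncio


# Hypothetical sketch only: a done-callback that checks the future's state
# before touching its result, instead of calling future.result() directly
# (which re-raises the exception inside asyncio's callback machinery).
def handle_future_result(future: asyncio.Future) -> None:
    if future.cancelled():
        # Nothing to report; the workflow run was cancelled.
        return
    exc = future.exception()  # returns the exception without raising it
    if exc is not None:
        # The caller already sees this error via `await handler`, so the
        # instrumentation layer can record it here and return quietly.
        return
    result = future.result()  # safe now: the future completed successfully
    # ... hand `result` off to the span/event bookkeeping ...
The real fix may propagate the error to the span-exit handlers rather than dropping it; the point is just that `future.exception()` gets checked before `future.result()` is called.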
Awesome thanks Logan 😁