Find answers from the community

OUTYUA
Offline, last seen last month
Joined November 6, 2024

Release

When is the next release?
15 comments
Hello everybody, there is a question (maybe a bug) in my app. When using Workflow, I send an event to the stream and, in the business logic, I use stream_events to receive it, but when the timeout occurs I cannot catch the exception, and stream_events stays blocked. Here is my code, thanks for your help.

Plain Text
    llm = get_llm()

    wf = ChatWorkflow(
        ctx=context,
        llm=llm,
        timeout=3,
        verbose=True,
    )

    async def stream_response():
        handler = wf.run()
        try:
            async for event in handler.stream_events():
                print("---->>>> ", event)
                if isinstance(event, ChatEvent):
                    yield SSEResponse.format_string(
                        SSEResponse.EventType.DATA,
                        event.message.model_dump(),
                    )
        except Exception as e:
            logger.error(f"Chat workflow error: {str(e)}")
            yield SSEResponse.format_string(
                SSEResponse.EventType.DATA,
                {"message": "server error"},
            )
        await handler

    return SSEResponse.send(stream_response())
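
A possible variant (a minimal sketch, not confirmed by the thread): the try/except above only wraps the stream_events loop, while the final await handler sits outside it, so an exception raised while awaiting the handler is never caught. Moving the await inside the try and catching the timeout explicitly might look like the sketch below; wf, ChatEvent, SSEResponse, and logger are reused from the snippet above, and the WorkflowTimeoutError import path is an assumption about the installed llama-index version.

Plain Text
import asyncio

# Sketch only: the import path below is an assumption about the installed
# llama-index version; fall back to asyncio.TimeoutError if it is missing.
try:
    from llama_index.core.workflow.errors import WorkflowTimeoutError
except ImportError:
    WorkflowTimeoutError = asyncio.TimeoutError

async def stream_response():
    handler = wf.run()
    try:
        async for event in handler.stream_events():
            if isinstance(event, ChatEvent):
                yield SSEResponse.format_string(
                    SSEResponse.EventType.DATA,
                    event.message.model_dump(),
                )
        # Await the handler inside the try block so an exception raised by
        # the workflow (including a timeout) propagates to the handlers below.
        await handler
    except WorkflowTimeoutError as e:
        logger.error(f"Chat workflow timed out: {e}")
        yield SSEResponse.format_string(
            SSEResponse.EventType.DATA,
            {"message": "request timed out"},
        )
    except Exception as e:
        logger.error(f"Chat workflow error: {e}")
        yield SSEResponse.format_string(
            SSEResponse.EventType.DATA,
            {"message": "server error"},
        )

Whether stream_events itself unblocks on timeout depends on the llama-index version, so this sketch only addresses the uncaught-exception part of the question.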
14 comments
About llama-index Workflow: are there any principles or design drafts?
1 comment