
Llama index workflows: modularization support and manual approaches

Hello, does Llama Index Workflows support modularization out of the box (splitting workflow steps into multiple files), or does it currently require a more manual approach? For example, defining a function with all the logic and then calling it with *args and **kwargs as parameters?
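For context, the "manual approach" described in the question might look roughly like this. This is a minimal pure-Python sketch with no llama-index imports; the function and module names are made up for illustration:

```python
# In practice, extract_step and summarize_step would live in separate
# files (e.g. steps/extract.py and steps/summarize.py) and be imported;
# they are shown inline here for brevity.

def extract_step(*args, **kwargs):
    # All the extraction logic lives here, in its own module.
    text = kwargs.get("text", "")
    return {"tokens": text.split()}

def summarize_step(*args, **kwargs):
    # Summarization logic, also importable from its own module.
    tokens = kwargs.get("tokens", [])
    return {"summary": f"{len(tokens)} tokens"}

# A single function then wires the pieces together, forwarding
# keyword arguments between steps as the question describes.
def run_pipeline(text):
    extracted = extract_step(text=text)
    return summarize_step(**extracted)
```

This keeps each step's logic in its own file, at the cost of wiring the steps together by hand.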
6 comments
Yes, that makes sense! I was curious if there’s a way to send a list of functions for the workflow to interpret as steps, rather than handling each step individually. But it’s not a big issue—just something I was curious about. Thank you! 🙂
Oh! You can do that as well, but your step definition will have to follow the usual syntax of async def some_fn(ev: SomeEvent) -> SomeOtherEvent or async def some_fn(ctx: Context, ev: SomeEvent) -> SomeOtherEvent.

For example
Python
from llama_index.core.workflow import Context, StartEvent, StopEvent, Workflow, step

@step
async def some_fn(ctx: Context, ev: StartEvent) -> StopEvent:
    return StopEvent(result="This came from a free function!")

w = Workflow()

w.add_step(some_fn)
I haven't seen this used in the wild tbh, so super open to feedback on the UX here
Thank you so much 🤘
qq: the step decorator needs a Workflow class for free functions, so if I pass llama_index.core.workflow.Workflow itself and then use the add_step instance method on a workflow instance, it raises a validation error here -> https://github.com/run-llama/llama_index/blob/ef7207247e0884d48b05c85a710634b43dc04b05/llama-index-core/llama_index/core/workflow/workflow.py#L142

Is that the expected behaviour?
Ah ok, that explains why the example in the docs passes in a workflow instance

Dang, I thought I could skip that because it felt janky lol
https://docs.llamaindex.ai/en/stable/module_guides/workflow/#decorating-non-class-functions