Workflow

At a glance

The post asks about the interface, API, and client SDK for LlamaIndex Workflows. The community members explain that a workflow itself does not expose an interface; it is a series of chained events used to manipulate responses and outputs. However, another LlamaIndex product, Llama-Deploy, which builds on workflows, does provide an SDK that allows direct interaction with deployed services and the ability to run workflows within it.

The community members confirm that both Llama-Deploy and LlamaIndex Workflows are open source. They also state that the latest version of Llama-Deploy should be compatible with the latest version of LlamaIndex Workflows, and that Llama-Deploy serves multiple requests concurrently/asynchronously.

What is the interface to LlamaIndex workflow? Does it have an API for client to access? Is there a client sdk for it?
Workflow doesn't have an interface. In simple terms, a workflow is a series of chained events that gives you the extra advantage of manipulating responses, outputs, and actions.

https://docs.llamaindex.ai/en/stable/module_guides/workflow/
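For illustration, here is a minimal sketch of chaining events into a workflow with llama-index-core. The event, step, and class names (`ProcessedEvent`, `EchoWorkflow`, the `query` kwarg) are hypothetical, and the exact API may vary across llama-index-core versions:

```python
import asyncio

from llama_index.core.workflow import (
    Event,
    StartEvent,
    StopEvent,
    Workflow,
    step,
)


# Hypothetical intermediate event carrying data between steps.
class ProcessedEvent(Event):
    text: str


class EchoWorkflow(Workflow):
    @step
    async def prepare(self, ev: StartEvent) -> ProcessedEvent:
        # StartEvent exposes the kwargs passed to .run() as attributes.
        return ProcessedEvent(text=str(ev.query).upper())

    @step
    async def finish(self, ev: ProcessedEvent) -> StopEvent:
        # Returning StopEvent ends the workflow; its result comes back from .run().
        return StopEvent(result=f"Processed: {ev.text}")


async def main() -> None:
    workflow = EchoWorkflow(timeout=60)
    result = await workflow.run(query="hello workflows")
    print(result)  # Processed: HELLO WORKFLOWS


if __name__ == "__main__":
    asyncio.run(main())
```

Each step is triggered by the event type it accepts and emits the next event in the chain, which is where the "manipulating responses/outputs" flexibility comes from.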

However, another LlamaIndex product, Llama-Deploy, which builds on workflows, has an SDK that allows you to interact with deployed services directly.
It lets you run workflows within it, helping you achieve a high level of customisation.

https://docs.llamaindex.ai/en/stable/module_guides/llama_deploy/#llama-deploy
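As a rough sketch of that SDK interaction, assuming a llama-deploy control plane is already running with a workflow registered under the hypothetical service name "rag_workflow" (client and config class names follow the llama_deploy docs and may differ between versions):

```python
from llama_deploy import ControlPlaneConfig, LlamaDeployClient

# Connect to a running llama-deploy control plane (default local settings).
client = LlamaDeployClient(ControlPlaneConfig())

# A session scopes one or more runs against deployed workflow services.
session = client.create_session()

# "rag_workflow" is a hypothetical service name chosen at deploy time;
# keyword arguments are forwarded to the workflow's StartEvent.
result = session.run("rag_workflow", query="What is llama-deploy?")
print(result)
```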
Ok and both Llama-Deploy and LlamaIndex Workflow are open source?
Yep, fully open-source!
And can Llama-Deploy use the latest version of the LlamaIndex Workflow product, in terms of the workflows it deploys and manages?
Are these two products kept in version sync, so that the latest Llama-Deploy works with the latest LlamaIndex Workflow?
Or are there version recommendations or compatibility considerations?
The llama-deploy package should work with the latest llama-index-core package, yes.
If I have multiple requests to a RAG system deployed in llama-deploy, will it serve the users concurrently/asynchronously or not?
It will, yes (it's async/concurrent by design).
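To illustrate that concurrency, here is a hedged sketch that issues several requests at once with the async client (AsyncLlamaDeployClient and the "rag_workflow" service name are assumptions based on the llama_deploy docs):

```python
import asyncio

from llama_deploy import AsyncLlamaDeployClient, ControlPlaneConfig


async def ask(client: AsyncLlamaDeployClient, query: str) -> str:
    # Each request gets its own session; runs execute concurrently server-side.
    session = await client.create_session()
    return await session.run("rag_workflow", query=query)


async def main() -> None:
    client = AsyncLlamaDeployClient(ControlPlaneConfig())
    queries = ["What is a workflow?", "What is llama-deploy?", "Is it open source?"]
    # asyncio.gather fires all requests at once instead of one after another.
    results = await asyncio.gather(*(ask(client, q) for q in queries))
    for q, r in zip(queries, results):
        print(q, "->", r)


if __name__ == "__main__":
    asyncio.run(main())
```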