Hey guys!
Just returned to dev with LlamaIndex now that Anthropic has added support for streaming in combination with tool calls; however, I haven't found an agent setup that supports this workflow.
For example...
from llama_index.core.agent import FunctionCallingAgentWorker

agent_worker = FunctionCallingAgentWorker.from_tools(
    [multiply_tool, add_tool],
    llm=llm,
    verbose=True,
    allow_parallel_tool_calls=False,
)
agent = agent_worker.as_agent()
This seems to be the generally accepted flow, although FunctionCallingAgentWorker and its AgentRunner don't currently support streaming.
Has anyone found a workaround for this use case?