Updated 5 months ago

Streaming

Hey guys!

Just returned to dev with LlamaIndex after Anthropic recently added support for streaming in combination with tool calls; however, I haven't found an agent setup that supports this workflow.

For example...
Plain Text
# Assumes multiply_tool, add_tool, and llm are already defined.
from llama_index.core.agent import FunctionCallingAgentWorker

agent_worker = FunctionCallingAgentWorker.from_tools(
    [multiply_tool, add_tool],
    llm=llm,
    verbose=True,
    allow_parallel_tool_calls=False,
)
agent = agent_worker.as_agent()

This seems to be the generally accepted flow, although FunctionCallingAgentWorker and its AgentRunner don't currently support streaming.

Has anyone found a workaround for this use case?
4 comments
Confused what you mean that it doesn't support streaming? agent.stream_chat(...) no?
It exists in the documentation, but running it raises a NotImplementedError : (
Using ReActAgent instead seems to support streaming through its AgentRunner subclass
Hmm πŸ€” maybe it's not actually implemented yet
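For anyone going the ReActAgent route, the streaming response is consumed as a token generator. Here's a minimal self-contained sketch of that consumption pattern, with a stub standing in for the real agent so it runs without LlamaIndex or an LLM key (`FakeStreamingResponse` and `fake_agent_stream_chat` are hypothetical stand-ins, not real library APIs):
Plain Text
from dataclasses import dataclass
from typing import Iterator, List

@dataclass
class FakeStreamingResponse:
    """Stub mimicking a streaming chat response: tokens arrive via a generator."""
    _tokens: List[str]

    @property
    def response_gen(self) -> Iterator[str]:
        yield from self._tokens

def fake_agent_stream_chat(message: str) -> FakeStreamingResponse:
    # A real agent would run the tool-calling loop and stream the final
    # answer from the LLM here; we fake a fixed token stream instead.
    return FakeStreamingResponse(_tokens=["The ", "answer ", "is ", "42."])

streamed = []
response = fake_agent_stream_chat("What is (2 + 4) * 7?")
for token in response.response_gen:
    streamed.append(token)  # in a real app: print(token, end="", flush=True)

full_text = "".join(streamed)

The point is just that whatever agent you end up with, the streaming call hands back an object whose token generator you iterate; swap the stub for a real agent's stream_chat and the loop stays the same.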