Streaming mode limitations with Anthropic tools

Hey, anyone using Anthropic with tools? It seems a little buggy - I'm getting this back: "Tools are not supported in streaming mode"
How are you using anthropic though?
Python
>>> from llama_index.core.tools import FunctionTool
>>> def get_phone_number(name: str) -> str:
...   """Useful for looking up a phone number for a name."""
...   return "3067744219"
... 
>>> tool = FunctionTool.from_defaults(fn=get_phone_number)

>>> from llama_index.llms.anthropic import Anthropic
>>> from llama_index.core.llms import ChatMessage
>>> llm = Anthropic(model="claude-3-5-sonnet-20241022")
>>> resp = llm.stream_chat_with_tools([tool], chat_history=[ChatMessage(role="user", content="What is Logan Markewich's phone number?")])
>>> for r in resp:
...   pass
... 
>>> print(r)
assistant: I'll help you look up Logan Markewich's phone number using the get_phone_number function.
>>> tool_calls = llm.get_tool_calls_from_response(r, error_on_no_function_call=False)
>>> tool_calls[0]
ToolSelection(tool_id='toolu_018Xhgfeu2v83ALxjx2hVjBe', tool_name='get_phone_number', tool_kwargs={'name': 'Logan Markewich'})
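From there you can execute the tool call yourself and hand the result back to the LLM for a final answer. Rough sketch, untested - the `tool_call_id` key in `additional_kwargs` is from memory and might differ, so check the docs:
Python
from llama_index.core.llms import ChatMessage

# keep building on the objects from the REPL session above
chat_history = [
    ChatMessage(role="user", content="What is Logan Markewich's phone number?"),
    r.message,  # the assistant message that contains the tool call
]

tool_call = tool_calls[0]

# run the selected tool with the arguments the model chose
tool_output = tool(**tool_call.tool_kwargs)

# send the result back as a tool message (key name assumed)
chat_history.append(
    ChatMessage(
        role="tool",
        content=str(tool_output),
        additional_kwargs={"tool_call_id": tool_call.tool_id},
    )
)

# ask for the final answer
final = llm.chat_with_tools([tool], chat_history=chat_history)
print(final.message.content)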
I think streaming hasn't been added to the FunctionCallingAgent yet
I'm using the lower-level (and arguably more useful) functions above
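If you'd rather stay with the agent abstraction until streaming lands there, the non-streaming path should still work. Untested sketch, import path from memory:
Python
from llama_index.core.agent import FunctionCallingAgent

# reuse the tool and llm defined above; non-streaming chat avoids the error
agent = FunctionCallingAgent.from_tools([tool], llm=llm, verbose=True)
response = agent.chat("What is Logan Markewich's phone number?")
print(response)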