
At a glance

The community member is experiencing an error when using astream_chat_with_tools with BedrockConverse, even though the async and stream functions work on their own. The error message points to the use of for chunk in response["stream"] instead of async for chunk in response["stream"] in llama_index/llms/bedrock_converse/base.py. The community members try the suggested change, which removes the original error, but there is still no output to the terminal. They also ask whether there is a way to use AsyncAnthropicBedrock in LlamaIndex when using Claude through Bedrock.

Hey all, I am getting the following error when using astream_chat_with_tools with BedrockConverse, even though the async and stream functions worked on their own:
Traceback (most recent call last):
  File "/home/ubuntu/poc/app_prod/app_prod/streaming.py", line 48, in <module>
    asyncio.run(main())
  File "/usr/lib/python3.11/asyncio/runners.py", line 188, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/asyncio/runners.py", line 120, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/asyncio/base_events.py", line 650, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/home/ubuntu/poc/app_prod/app_prod/streaming.py", line 44, in main
    async for chunk in stream_response():
  File "/home/ubuntu/poc/app_prod/app_prod/streaming.py", line 37, in stream_response
    async for chat_response in await llm.astream_chat_with_tools(tools, chat_history=chat_history):
  File "/home/ubuntu/.cache/pypoetry/virtualenvs/app-prod-NgnrhRJP-py3.11/lib/python3.11/site-packages/llama_index/core/llms/callbacks.py", line 89, in wrapped_gen
    async for x in f_return_val:
  File "/home/ubuntu/.cache/pypoetry/virtualenvs/app-prod-NgnrhRJP-py3.11/lib/python3.11/site-packages/llama_index/llms/bedrock_converse/base.py", line 462, in gen
    for chunk in response["stream"]:
  File "/home/ubuntu/.cache/pypoetry/virtualenvs/app-prod-NgnrhRJP-py3.11/lib/python3.11/site-packages/aiobotocore/eventstream.py", line 10, in __iter__
    raise NotImplementedError('Use async-for instead')
NotImplementedError: Use async-for instead
Seems like line 462 of llama_index/llms/bedrock_converse/base.py should be async for chunk in response["stream"]: instead?
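For reference, the suggested change would look roughly like this inside the async generator in base.py (a sketch; the surrounding code is assumed):

# llama_index/llms/bedrock_converse/base.py, around line 462
# Before -- iterating an aiobotocore EventStream synchronously raises
# NotImplementedError('Use async-for instead'):
#     for chunk in response["stream"]:
#         ...
# After -- iterate the event stream asynchronously:
async for chunk in response["stream"]:
    ...  # handle each streamed chunk as before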
Yup that's what it's throwing
Can you open /home/ubuntu/.cache/pypoetry/virtualenvs/app-prod-NgnrhRJP-py3.11/lib/python3.11/site-packages/llama_index/llms/bedrock_converse/base.py and make that change locally and see if it fixes it?

I don't have access to Bedrock to test, but I can make the PR if that's actually the fix
yeah I will try that now
I just keep getting a hanging terminal with no output. I believe I am implementing it correctly, but if there is a notebook somewhere that I can test against, maybe that would be best
Hmm, maybe that's not fully the issue then πŸ€”
When I make the change, the original error does go away, but I get no output to the terminal; after trying several different things in my script I could not figure out why
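One way to narrow down the hang is to strip the tools out and check that plain async streaming still works (a minimal sketch; the model id is an assumption):

import asyncio

from llama_index.core.llms import ChatMessage
from llama_index.llms.bedrock_converse import BedrockConverse

async def main():
    llm = BedrockConverse(model="anthropic.claude-3-sonnet-20240229-v1:0")
    # astream_chat also resolves to an async generator of ChatResponse objects
    async for r in await llm.astream_chat(
        [ChatMessage(role="user", content="Say hello")]
    ):
        print(r.delta, end="", flush=True)

asyncio.run(main())

If this streams but the tools variant hangs, the remaining problem is likely in the tool-call handling rather than the event-stream iteration.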
Is there also a way to use AsyncAnthropicBedrock in LlamaIndex when using Claude through Bedrock?