
Updated 5 months ago

turns out claude responses can be empty

Turns out Claude responses can be empty if you are sending an already-completed messages sequence. I monkey-patched an interceptor there.
Otherwise you get:

Plain Text
File ~/.local/lib/python3.10/site-packages/llama_index/llms/bedrock/utils.py:150, in AnthropicProvider.get_text_from_response(self, response)
    149 def get_text_from_response(self, response: dict) -> str:
--> 150     return response["content"][0]["text"]

IndexError: list index out of range

Should probably have some second-stage validation.


Temporary monkey patch that fixed it:
Plain Text
import llama_index.llms.bedrock.utils

def patched_get_text_from_response(self, response: dict) -> str:
    # Return '' instead of raising IndexError on an empty "content" list.
    if len(response["content"]) > 0:
        return response["content"][0]["text"]
    return ''

llama_index.llms.bedrock.utils.AnthropicProvider.get_text_from_response = patched_get_text_from_response
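A slightly more defensive sketch of the same idea (the function name is hypothetical; this version also guards against the "content" key being missing entirely, not just empty):

```python
def safe_get_text_from_response(self, response: dict) -> str:
    # Bedrock can return an empty "content" list when the messages
    # sequence is already complete; fall back to an empty string
    # instead of indexing into it blindly.
    content = response.get("content") or []
    if content:
        return content[0].get("text", "")
    return ""

# Patched in the same way as above (requires llama-index-llms-bedrock):
# import llama_index.llms.bedrock.utils
# llama_index.llms.bedrock.utils.AnthropicProvider.get_text_from_response = safe_get_text_from_response
```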
10 comments
What's the context? I used Claude with llama-index agents and retrieval pipelines and never had that issue.
Seems like this is happening on the Bedrock side.
I am connecting directly to Anthropic's API.
So that must be why.
TBH it was very much an edge case I hit while testing something else. I was setting max_tokens to 5
and making recursive calls to generate a complete output by appending the previous generations.
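That recursive-call approach can be sketched roughly like this (hypothetical `generate_chunk` stands in for the actual LLM call with a tiny max_tokens; the loop appends each partial generation and re-prompts until the model returns nothing):

```python
def generate_long(prompt: str, generate_chunk, max_rounds: int = 50) -> str:
    # generate_chunk(text) -> next partial completion ('' when done).
    # Stands in for an LLM call with a small max_tokens setting.
    output = ""
    for _ in range(max_rounds):
        chunk = generate_chunk(prompt + output)
        if not chunk:  # empty response: the sequence is complete
            break
        output += chunk
    return output
```

With max_tokens as low as 5, the final round tends to return an empty completion, which is exactly where the IndexError above gets triggered.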
lemme clean up the notebook and share it with you guys.
Not sure about the Anthropic or GCP side, but it is an edge case in Bedrock at least.
What is the objective? (I know it's beside the point)
generating texts larger than 4k tokens