Experiencing this issue on llama-index 0.10.19 with all models, but most commonly with AzureOpenAI: `Encountered exception writing response to history: Connection error.`
@Logan M We've switched everything over to be completely synchronous and the problem has gone away.
I will put together some minimal reproducible examples for you later today, for both the Claude issues I was experiencing and these issues.
That way we can reach out to Anthropic for the fixes they need to make, and I can give you some more insight into what's going on with the async code that may just need a little love.
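In the meantime, here is roughly the shape of the change we made, not the exact code from our app; the AzureOpenAI deployment name, endpoint, and data path are placeholders. The async entry points (`astream_chat` / `achat`) were swapped for their synchronous counterparts:

```python
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.azure_openai import AzureOpenAI

# Placeholder Azure deployment details -- substitute your own.
Settings.llm = AzureOpenAI(
    engine="my-gpt4-deployment",
    model="gpt-4",
    api_key="...",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_version="2024-02-01",
)

index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("./data").load_data()
)
chat_engine = index.as_chat_engine(chat_mode="condense_plus_context")

# Before: response = await chat_engine.astream_chat("...")
#   -> intermittently raised "Encountered exception writing response to
#      history: Connection error." when the response was written to memory.
# After: plain synchronous call, which has not shown the error for us.
response = chat_engine.chat("Summarize the uploaded documents.")
print(response)
```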