
Updated 9 months ago


Experiencing this issue on llama-index 0.10.19 with all models, but most commonly with AzureOpenAI:
Encountered exception writing response to history: Connection error.
Seems like an issue with the overall connection to Azure?

Testing llm.complete("Hello world") and confirming that works is probably a good first step
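A minimal sketch of that sanity check, assuming llama-index 0.10.x with the `llama-index-llms-azure-openai` package installed; the deployment name, endpoint, and API version below are placeholders, not values from this thread:

```python
# Hedged sketch: all credential/deployment values are placeholders.
from llama_index.llms.azure_openai import AzureOpenAI

llm = AzureOpenAI(
    engine="my-deployment",            # placeholder Azure deployment name
    api_key="...",                     # placeholder credential
    azure_endpoint="https://<resource>.openai.azure.com/",
    api_version="2024-02-15-preview",  # placeholder API version
)

# If this simple synchronous call raises the same "Connection error.",
# the problem is the Azure connection itself, not the chat-history code.
print(llm.complete("Hello world"))
```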
@Logan M
We've switched everything over to be synchronous completely and the problem has gone away.

I will create some minimal reproducible examples for you later today, for both the Claude issues I was experiencing and these issues

That way we can reach out to Anthropic for the fixes they need to make, and I can give you some more insight into what's going on with the async code that may just need a little love
That makes sense to me
In this case, it could just be sporadic issues with azure
πŸ‘ appreciate the support
Yeah, Azure is a piece of shit
But it happens with other models as well
Claude, and some of the Hugging Face ones we have
It seems like there might be a low-level async issue with the way the async generator code interacts with the history
My guess is there's not a lot of people using async response gen
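A minimal, self-contained sketch of the failure mode being described; `stream_tokens` and `write_response_to_history` are hypothetical stand-ins, not llama-index code:

```python
import asyncio

async def stream_tokens():
    """Hypothetical async response generator that fails mid-stream."""
    for tok in ["Hello", " ", "world"]:
        yield tok
    raise ConnectionError("upstream closed")  # simulated transport failure

async def write_response_to_history(history):
    try:
        text = ""
        async for tok in stream_tokens():
            text += tok
        history.append(text)
    except Exception as e:
        # Broad except: only the message survives; the traceback pointing
        # at the async generator is lost.
        print(f"Encountered exception writing response to history: {e}")

history = []
asyncio.run(write_response_to_history(history))
# history stays empty, and nothing in the output points at stream_tokens
```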
I think the other issue is it's swallowing the real traceback, making debugging a challenge
Even when I added prints and stuff deep down, I wasn't able to get much
That's where it's swallowed
If the try/except wasn't there, we could get a better traceback
Thank you I'll look into it