@Logan M Need help! 🥹

My llm.stream_complete() call sometimes throws this error: "requests.exceptions.ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))"

The LLM I am using is gpt-4. Any idea how to avoid this error?
Google tells me this might be a server-side issue with OpenAI 🤔
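If it is a transient server-side hiccup, the usual workaround is to retry the call with backoff instead of letting the exception propagate. A minimal sketch of such a wrapper — the `retry` helper and the `exceptions` parameter are my own illustration, not part of any library's API:

```python
import time


def retry(fn, *args, retries=3, delay=1.0, exceptions=(Exception,), **kwargs):
    """Call fn(*args, **kwargs), retrying on the given exception types.

    Hypothetical helper: wrap a flaky call (e.g. llm.stream_complete)
    so a transient error triggers a retry with exponential backoff
    instead of crashing.
    """
    for attempt in range(retries):
        try:
            return fn(*args, **kwargs)
        except exceptions:
            if attempt == retries - 1:
                raise  # out of retries; re-raise the last error
            time.sleep(delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

Against the real client, the call would look something like `retry(llm.stream_complete, prompt, exceptions=(requests.exceptions.ChunkedEncodingError,))` — that usage is an assumption based on the error in the question, not something I've verified against gpt-4.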