Need help! 🥹
So my llm.stream_complete() call sometimes throws this error: "requests.exceptions.ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))"
The LLM I'm using is gpt-4. Any idea how to avoid this error?
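A `ChunkedEncodingError` with `InvalidChunkLength(got length b'', 0 bytes read)` usually means the server (or a proxy in between) closed the connection mid-stream, so the chunked HTTP response was cut off. Since the stream can't be resumed, one common workaround is to catch the error and restart the whole streaming call with backoff. Below is a minimal sketch of that idea; `stream_fn` stands in for your `llm.stream_complete`, and the helper name and parameters are illustrative, not part of any library API:

```python
import time
import requests

def stream_with_retry(stream_fn, prompt, max_retries=3, backoff=1.0):
    """Retry a streaming completion call that may drop mid-stream.

    `stream_fn` is assumed to behave like llm.stream_complete: called
    with a prompt, it returns an iterable of chunks. A
    ChunkedEncodingError means the stream was cut off, so we restart
    it from scratch rather than trying to resume.
    """
    for attempt in range(max_retries):
        try:
            # Consume the full stream; a mid-stream drop raises here.
            return list(stream_fn(prompt))
        except requests.exceptions.ChunkedEncodingError:
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error
            # Exponential backoff before restarting the stream.
            time.sleep(backoff * (2 ** attempt))
```

Note the retry replays the entire stream, so any chunks you already showed the user get produced again; buffer them and only display after a successful full pass, or dedupe on retry.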