Are there any helpers in LlamaIndex

Are there any helpers in LlamaIndex that we can use to check for rate limits for specific LLMs?

We are trying to make full use of our Azure OpenAI rate limits and want to fall back to OpenAI directly as a backup. Would this be possible?
Hmm, there are already built-in retries for rate limits, although these aren't really user-configurable yet; I've been meaning to make that more configurable.

Beyond that, it's just try/except and handling it from there.
(Azure extends the OpenAI class, so it gets these retries out of the box.)
Cool, thanks 🙂 So if we want to switch LLMs after hitting the rate limit too many times, we should just catch the specific rate limit error (I don't even know if it has its own exception class)?
Yeah, that's the approach I would take. I think it should raise a specific error class (something about retries or rate limits).
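For reference, here is a minimal sketch of the fallback pattern described above. It assumes the openai v1 SDK (which exposes `openai.RateLimitError`, the exception both clients raise once built-in retries are exhausted) and the modular `llama-index-llms-openai` / `llama-index-llms-azure-openai` packages; the deployment name, model names, API version, and environment variable names are placeholders, not values from this thread.

```python
import os

import openai  # the openai SDK defines RateLimitError, raised by both clients
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.llms.openai import OpenAI

# Primary: Azure OpenAI deployment (placeholder deployment/endpoint values).
azure_llm = AzureOpenAI(
    engine="my-gpt-4o-deployment",  # your Azure deployment name
    model="gpt-4o",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-15-preview",
)

# Backup: OpenAI directly.
openai_llm = OpenAI(model="gpt-4o", api_key=os.environ["OPENAI_API_KEY"])


def complete_with_fallback(prompt: str) -> str:
    """Try Azure first; if a RateLimitError still bubbles up after the
    built-in retries, fall back to OpenAI."""
    try:
        return azure_llm.complete(prompt).text
    except openai.RateLimitError:
        # Azure quota exhausted -- switch to the backup provider.
        return openai_llm.complete(prompt).text


print(complete_with_fallback("Say hello in one sentence."))
```

The same try/except wrapper can go around `chat()` or streaming calls; the key point is that the rate-limit failure surfaces as a specific exception class from the openai SDK rather than a generic error, so the fallback only triggers when the quota is actually exhausted.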