Hi folks,
I have about 300k nodes and I'm trying to run the metadata_extractor on them.

nodes_metaextract = metadata_extractor.process_nodes(nodes)

Hitting OpenAI's RateLimitError. Any workarounds to this?

Alternatively, I suppose I could chunk the nodes, run the metadata extraction in batches, and reschedule a batch whenever the rate limit is hit (rough sketch below).

RateLimitError: Rate limit reached for default-gpt-3.5-turbo in organization org-0g4tBZCHdpzT3CoGj0Hdrlo0 on requests per day. Limit: 200 / day. Please try again in 7m12s. Contact us through our help center at help.openai.com if you continue to have issues.
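Not something from the thread, just a minimal sketch of that chunk-and-retry idea. It assumes the openai>=1.0 SDK (where the exception is openai.RateLimitError; older SDKs expose it as openai.error.RateLimitError) and that process_nodes accepts a sublist of nodes; the helper name, batch size, and backoff values are made up.

import time
from openai import RateLimitError  # openai>=1.0; on older SDKs use openai.error.RateLimitError

def extract_in_batches(extractor, nodes, batch_size=100, max_retries=5):
    """Run process_nodes over small batches, backing off whenever the rate limit is hit."""
    results = []
    for start in range(0, len(nodes), batch_size):
        batch = nodes[start:start + batch_size]
        for attempt in range(max_retries):
            try:
                results.extend(extractor.process_nodes(batch))
                break
            except RateLimitError:
                # A requests-per-day limit can need hours rather than seconds, so wait generously.
                time.sleep((2 ** attempt) * 60)
        else:
            raise RuntimeError(f"Batch starting at node {start} still rate-limited after {max_retries} tries")
    return results

# nodes_metaextract = extract_in_batches(metadata_extractor, nodes)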
4 comments
Your limit is 200 requests per day 🀯

It will take 4.1 years to process 300k nodes πŸ˜…
Yeah, I think upgrading to a paid plan makes sense πŸ˜…
Should only take a few days with that
xD It's kinda ridiculous. Actually, I could get about 200*10 RPD if I chunk the nodes and run them concurrently across the GPT-3.5 + GPT-4 family. I was mildly hoping somebody else had figured out a better way.

I'm not sure I want to dip into my Azure credits (the RPD / RPM / TPM limits there are much better) just yet, because I have 2.5k in OpenAI credits that are set to expire in November.
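For what it's worth, the "spread batches across several models" idea could look roughly like the sketch below. It assumes you build one metadata extractor per model up front (constructing them is not shown) and that each exposes the same process_nodes method as above; it only parallelises the work and does not manage each model's own rate limits.

from concurrent.futures import ThreadPoolExecutor

def extract_across_models(extractors, nodes, batch_size=100):
    """Round-robin batches of nodes across pre-configured extractors (one per model) and run them concurrently."""
    batches = [nodes[i:i + batch_size] for i in range(0, len(nodes), batch_size)]
    results = []
    with ThreadPoolExecutor(max_workers=len(extractors)) as pool:
        futures = [
            pool.submit(extractors[i % len(extractors)].process_nodes, batch)
            for i, batch in enumerate(batches)
        ]
        for future in futures:
            results.extend(future.result())  # futures are read in submission order, so node order is preserved
    return results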