Avinaash Anand
Hi folks,
I have about 300k nodes and am trying to run the metadata_extractor on them.

nodes_metaextract = metadata_extractor.process_nodes(nodes)

Hitting OpenAI's RateLimitError. Are there any workarounds for this?

Alternatively, I suppose I could process the nodes in smaller chunks and reschedule the extraction task whenever the rate limit is hit (rough sketch below).

RateLimitError: Rate limit reached for default-gpt-3.5-turbo in organization org-0g4tBZCHdpzT3CoGj0Hdrlo0 on requests per day. Limit: 200 / day. Please try again in 7m12s. Contact us through our help center at help.openai.com if you continue to have issues.
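
Something like this is what I mean by chunking, as a rough sketch (process_in_batches, the batch size, and the wait time are my own placeholders; the only library call is the existing metadata_extractor.process_nodes):

import time

try:
    from openai import RateLimitError          # openai >= 1.0
except ImportError:
    from openai.error import RateLimitError    # older openai releases

BATCH_SIZE = 500          # placeholder batch size, tune to your limits
RETRY_WAIT_SECONDS = 600  # the error says "try again in 7m12s", so wait ~10 min

def process_in_batches(extractor, nodes, batch_size=BATCH_SIZE):
    """Run process_nodes over small batches, backing off and retrying on RateLimitError."""
    processed = []
    for start in range(0, len(nodes), batch_size):
        batch = nodes[start:start + batch_size]
        while True:
            try:
                processed.extend(extractor.process_nodes(batch))
                break
            except RateLimitError:
                # Wait out the rate-limit window, then retry the same batch.
                time.sleep(RETRY_WAIT_SECONDS)
    return processed

nodes_metaextract = process_in_batches(metadata_extractor, nodes)

With a 200 requests/day cap this will still take a long time over 300k nodes, so it only papers over the limit rather than removing it.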