@Logan M @jerryjliu0 When building a GPTSimpleVectorIndex over a large file (over 90,000 rows), the OpenAI API sometimes returns an error partway through. Is there a way to resume indexing from the point of failure, or some other mechanism to build the index in batches?
I tried to index 900,000 words of text but got the following error, which eventually caused the run to fail: `openai.error.APIError: Internal error { "error": { "message": "Internal error", "type": "internal_error", "param": null, "code": "internal_error" } }`. Is there a setting to control the interval between requests, or some other rate-limiting / flow-control mechanism?
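For context, this is roughly the kind of mechanism I'm hoping exists (or that I'd otherwise have to hand-roll): insert documents one at a time and retry each insert with exponential backoff instead of failing the whole build. This is only a sketch of the idea, not llama_index code; `insert_fn` stands in for something like `index.insert(document)`, and the retry parameters are made up.

```python
import random
import time

def insert_with_retry(insert_fn, doc, max_retries=5, base_delay=1.0):
    """Call insert_fn(doc), retrying with exponential backoff on failure.

    insert_fn is a placeholder for a per-document insert call (e.g.
    index.insert); the backoff-and-retry loop is the "flow control"
    I'm asking about for large batch index builds.
    """
    for attempt in range(max_retries):
        try:
            return insert_fn(doc)
        except Exception:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            # wait longer after each failure, with a little jitter
            time.sleep(base_delay * (2 ** attempt) + random.random() * 0.1)

def build_in_batches(insert_fn, docs, **retry_kwargs):
    """Insert docs one by one so a single API error doesn't lose everything."""
    for doc in docs:
        insert_with_retry(insert_fn, doc, **retry_kwargs)
```

With something like this, a transient `internal_error` on document 80,000 would be retried rather than aborting the entire run, and the loop could also record the last successfully inserted document so the build can resume from a breakpoint.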