I'm running into problems integrating Llama-Index into a Celery task for concurrent processing. Specifically, building an index with the VectorStoreIndex class fails when it runs inside a Celery task. I'd like to know how to use Llama-Index correctly with Celery's concurrency, or why vector creation stalls inside a Celery task.
When I create VectorStoreIndex instances inside Celery tasks, execution hangs at the vector-creation step. The worker picks up the task and runs normally up to that point, but the call that builds the index never completes, even though Celery's concurrency otherwise behaves as expected.
```python
# The call where the task gets stuck (imports as in llama_index 0.9.x,
# where the service_context argument is still supported):
from llama_index import Document, VectorStoreIndex

document = Document(text=str(data))
index = VectorStoreIndex.from_documents([document], service_context=self.service_context)
```
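For context, here is a stripped-down sketch of how the task is wired up. It is a simplified reproduction, not my exact production code: the task name `build_index`, the Redis broker URL, and the plain `ServiceContext.from_defaults()` call (which assumes OpenAI credentials are available in the environment) are placeholders; in the real code the context lives on a bound task as `self.service_context`.

```python
from celery import Celery
from llama_index import Document, ServiceContext, VectorStoreIndex

# Placeholder app/broker configuration for illustration only.
app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def build_index(text: str) -> int:
    # Simplified: construct the ServiceContext inside the task body here;
    # in the actual code it is held on a bound task as self.service_context.
    service_context = ServiceContext.from_defaults()

    document = Document(text=text)
    # This is the call that never completes when run under Celery.
    index = VectorStoreIndex.from_documents(
        [document], service_context=service_context
    )
    # Return something JSON-serializable rather than the index object itself.
    return len(index.docstore.docs)
```

The task is enqueued with `build_index.delay(str(data))`; the worker picks it up but stalls at the `from_documents` call.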