I keep running into an issue: whenever I set use_async=True on my query engine, I get an
asyncio.exceptions.CancelledError
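For context, my setup looks roughly like the sketch below (the data path, tool name, and question are placeholders; the important part is passing use_async=True to SubQuestionQueryEngine.from_defaults, which matches the sub_question_query_engine.py frame in the trace):

from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.query_engine import SubQuestionQueryEngine
from llama_index.tools import QueryEngineTool, ToolMetadata

# Build a plain vector index over some local documents ("./data" is a placeholder)
docs = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(docs)

# Wrap the index's query engine as a tool for the sub-question engine
tool = QueryEngineTool(
    query_engine=index.as_query_engine(),
    metadata=ToolMetadata(name="docs", description="local documents"),
)

# Setting use_async=True here is what triggers the CancelledError
query_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=[tool],
    use_async=True,
)
response = query_engine.query("example question")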
The relevant portions of the stack trace seem to be:
File "/path/to/lib/python3.11/site-packages/llama_index/query_engine/sub_question_query_engine.py", line 226, in _aquery_subq
response = await query_engine.aquery(question)
...
File "/path/to/lib/python3.11/site-packages/llama_index/response_synthesizers/refine.py", line 325, in aget_response
response = await self._agive_response_single(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/path/to/lib/python3.11/site-packages/llama_index/response_synthesizers/refine.py", line 430, in _agive_response_single
structured_response = await program.acall(
^^^^^^^^^^^^^^^^^^^^
File "/path/to/lib/python3.11/site-packages/llama_index/response_synthesizers/refine.py", line 79, in acall
answer = await self._llm.apredict(
...
File "/path/to/lib/python3.11/site-packages/openai/_base_client.py", line 1536, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/path/to/lib/python3.11/site-packages/openai/_base_client.py", line 1315, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "/path/to/lib/python3.11/site-packages/openai/_base_client.py", line 1339, in _request
response = await self._client.send(