Hello, is it possible for `OpenAIAgent` function execution to be concurrent, especially in an asynchronous environment? Note that this is different from parallel function calling, where the model returns several tool calls in one response; what I am looking for is a way to execute those parallel tool calls concurrently, something like `asyncio.gather` (see the sketch below). Thanks!!
Relevant docs:
https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_parallel_function_calling/#example-from-openai-docs

I tried putting `asyncio.sleep` inside `get_current_weather` (modified to be async) to confirm that the function executions are in fact not concurrent.
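Roughly, my check looks like the sketch below. It adapts the `get_current_weather` tool from the docs page above; the import paths, model name, and `FunctionTool.from_defaults(async_fn=...)` usage are assumptions that may vary with the installed llama-index version. With three cities in the prompt, I see an elapsed time of roughly 3 x the sleep, which suggests the parallel tool calls are awaited one after another rather than gathered.

```python
import asyncio
import json
import time

# Import paths assume a recent llama-index with the OpenAI integrations installed.
from llama_index.agent.openai import OpenAIAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI


async def get_current_weather(location: str, unit: str = "fahrenheit") -> str:
    """Get the current weather in a given location."""
    await asyncio.sleep(2)  # simulate slow I/O so sequential vs. concurrent is visible
    return json.dumps({"location": location, "temperature": "unknown", "unit": unit})


# Newer versions can build the schema from async_fn; older ones may also need a sync fn.
weather_tool = FunctionTool.from_defaults(async_fn=get_current_weather)

agent = OpenAIAgent.from_tools(
    [weather_tool],
    llm=OpenAI(model="gpt-3.5-turbo-1106"),  # any model that supports parallel tool calls
    verbose=True,
)


async def main() -> None:
    start = time.perf_counter()
    await agent.achat("What's the weather like in San Francisco, Tokyo, and Paris?")
    # ~6s elapsed => the three tool calls ran sequentially;
    # ~2s would indicate they ran concurrently (the behaviour I'm after).
    print(f"elapsed: {time.perf_counter() - start:.1f}s")


asyncio.run(main())
```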