----------------------
Yes, you can create multiple instances of `chat_engine`, one per user, to handle simultaneous chats. However, it's worth noting that LlamaIndex also supports concurrent chats, which means multiple users can chat simultaneously with a single instance of `chat_engine`.
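If you prefer the per-user approach, here is a minimal sketch, assuming an existing `VectorStoreIndex` named `index`; the `user_engines` dict and `get_engine_for_user` helper are hypothetical names for illustration, and the import path may differ across LlamaIndex versions:
```python
from llama_index.core import VectorStoreIndex  # assumption: recent LlamaIndex versions

# Hypothetical registry mapping each user to their own chat engine
user_engines: dict[str, object] = {}

def get_engine_for_user(user_id: str, index: VectorStoreIndex):
    # Lazily create one engine per user so each keeps a separate chat history
    if user_id not in user_engines:
        user_engines[user_id] = index.as_chat_engine()
    return user_engines[user_id]
```
Each engine keeps its own conversation memory, so users' histories stay isolated, at the cost of one engine object per user.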
Here's an example of how you can handle concurrent chats:
```python
from concurrent.futures import ThreadPoolExecutor
from typing import Callable
import asyncio

def run_in_event_loop(func: Callable):
    # Each worker thread needs its own event loop to run the async task
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.run_until_complete(func())
    loop.close()

async def run_task_n_times(task: Callable, n: int):
    def thread_task():
        run_in_event_loop(task)

    # Run n calls to the task concurrently and wait for them all to finish
    with ThreadPoolExecutor(max_workers=5) as executor:
        futures = [executor.submit(thread_task) for _ in range(n)]
        for future in futures:
            future.result()

# Top-level await works in a notebook; elsewhere, wrap this in asyncio.run()
await run_task_n_times(run_chat, 5)
```
In this example, `run_chat` is a function that handles a single chat interaction, and `run_task_n_times` runs `run_chat` concurrently `n` times. This allows multiple users to chat simultaneously with the same `chat_engine`.
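`run_chat` itself isn't defined in the snippet above; a minimal sketch of what it might look like, assuming a shared `chat_engine` created elsewhere and using the chat engine's async `achat` method (the message text here is just a placeholder):
```python
# Assumes a shared engine created elsewhere, e.g. chat_engine = index.as_chat_engine()
async def run_chat():
    # achat is the async counterpart of chat on LlamaIndex chat engines
    response = await chat_engine.achat("What did the author do growing up?")
    print(response)
```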
You can find more details about this in the TestConcurrency notebook from the LlamaIndex GitHub repository.
----------------------
Note: you can ask me a follow-up question by @-mentioning me again :speech_balloon:
----------------------