
Updated 3 months ago

Requested

Is there a way to run multiple OpenAIAgents using the callback manager concurrently?
I have tried the following for one query to one OpenAIAgent 'top_agent':
Plain Text
thread = Thread(target=agent.stream_chat, args=(query,))  # args must be a tuple
thread.start()

This works for one query, but to run multiple queries concurrently I tried the following with threading:
Plain Text
def query_llama_index(query, agent):
    response = agent.stream_chat(query)

queries = ["riddle me this", "riddle me that"]
threads = []
for query in queries:
    thread = Thread(target=query_llama_index, args=(query, agent))
    thread.start()
    threads.append(thread)
for thread in threads:  # join after starting all threads, or they run one at a time
    thread.join()

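For reference, here is a minimal runnable sketch of that start-all-then-join-all pattern. The `StubAgent` below is a made-up stand-in so the threading logic can run without LlamaIndex installed; it is not the real OpenAIAgent API:

```python
from threading import Thread

class StubAgent:
    # Hypothetical stand-in for an OpenAIAgent, used only to make
    # the threading pattern runnable without llama_index.
    def stream_chat(self, query):
        return f"echo: {query}"

def query_llama_index(query, agent, results):
    # Each worker stores its result keyed by query.
    results[query] = agent.stream_chat(query)

agent = StubAgent()
queries = ["riddle me this", "riddle me that"]
results = {}

threads = [Thread(target=query_llama_index, args=(q, agent, results))
           for q in queries]
for t in threads:
    t.start()
for t in threads:  # join only after *all* threads have started
    t.join()
```

Joining inside the start loop would serialize the queries, since each `join()` blocks until that thread finishes before the next one starts.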
This runs, but I receive no output from the stream. My StreamingResponse looks like this:
Plain Text
def event_generator():
    queue = agent.callback_manager.handlers[0].queue

    # stream response
    while True:
        next_item = queue.get(True, 60.0)  # generous 60-second timeout
        # check whether next_item is an intermediate event or the final response
        if isinstance(next_item, EventObject):
            yield convert_sse(dict(next_item))
        elif isinstance(next_item, StreamingAgentChatResponse):
            response = cast(StreamingAgentChatResponse, next_item)
            for text in response.response_gen:
                yield convert_sse(text)
            break

# inside the endpoint handler (not inside the generator):
return StreamingResponse(event_generator(), media_type="text/event-stream")

Can OpenAIAgents be used in multiple threads? If so, how do I queue all these callbacks into one queue and receive them?
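One generic way to multiplex several streams through a single queue is to tag every item with the query it belongs to and push a per-stream sentinel when a stream ends. This is a sketch of that pattern only; the producer and sentinel names are made up and not part of the LlamaIndex callback API:

```python
import queue
from threading import Thread

shared_q = queue.Queue()
DONE = object()  # sentinel marking the end of one stream

def producer(query_id, tokens):
    # Hypothetical producer: in a real app this would be the agent's
    # callback handler pushing events; here we push plain tokens.
    for tok in tokens:
        shared_q.put((query_id, tok))
    shared_q.put((query_id, DONE))

streams = {"q1": ["a", "b"], "q2": ["x", "y"]}
threads = [Thread(target=producer, args=(qid, toks))
           for qid, toks in streams.items()]
for t in threads:
    t.start()

# Consume everything from the one queue, demultiplexing by query_id.
received = {qid: [] for qid in streams}
finished = 0
while finished < len(streams):
    qid, item = shared_q.get(timeout=5)
    if item is DONE:
        finished += 1
    else:
        received[qid].append(item)

for t in threads:
    t.join()
```

The per-stream ordering is preserved because each producer writes its own items in order; only the interleaving between streams is nondeterministic.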
1 comment
Oof, I'm not sure I'd use the callback manager for this πŸ˜…

Is there a reason to not use the response gen on the response object?
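In that spirit, a sketch of each thread consuming its own response's generator directly, with no shared callback queue. The `StubAgent`/`StubResponse` classes are stand-ins so the pattern runs without LlamaIndex; the real `stream_chat` returns a `StreamingAgentChatResponse` whose `response_gen` yields text chunks:

```python
from threading import Thread

class StubResponse:
    # Stand-in for StreamingAgentChatResponse: response_gen yields text chunks.
    def __init__(self, chunks):
        self._chunks = chunks

    @property
    def response_gen(self):
        yield from self._chunks

class StubAgent:
    # Stand-in for OpenAIAgent.stream_chat.
    def stream_chat(self, query):
        return StubResponse([query, "!"])

def run_query(agent, query, out):
    # Consume this query's own generator; no shared queue needed.
    response = agent.stream_chat(query)
    out[query] = "".join(response.response_gen)

agent = StubAgent()
outputs = {}
threads = [Thread(target=run_query, args=(agent, q, outputs))
           for q in ["riddle me this", "riddle me that"]]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Because each thread owns its response object, the streams never mix, which sidesteps the single-shared-queue problem entirely.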