
hello guys,

I'm currently working with an OpenAI Agent that has multiple query engine tools.

I want to stream the query engine tool's result directly from the OpenAI Agent.

However, even though I set streaming to True, the OpenAI Agent only streams output after the response from the query engine has completed.
Does anybody have a solution or idea for this issue?

The query engine tool is defined as below:

query_engine: RetrieverQueryEngine = self._as_query_engine(
    similarity_top_k=similarity_top_k,
    vector_store_query_mode=vector_store_query_mode,
    only_live=only_live,
    llm=llm or Settings.cheap_llm,
    node_postprocessors=node_postprocessors,
    response_mode=response_mode,
    text_qa_template=text_qa_template,
    refine_template=refine_template,
    summary_template=summary_template,
    simple_template=simple_template,
    use_rerank=use_rerank,
    rerank_top_k=rerank_top_k,
    rerank_template=rerank_template,
    use_async=use_async,
    streaming=streaming,
    verbose=verbose,
)

return QueryEngineTool.from_defaults(
    query_engine=query_engine,
    name=tool_name,
    description=tool_description,
    return_direct=return_direct,
)
6 comments
Are you using an LLM that supports Streaming?
Also, I'm not convinced that async=true and streaming can live together
You also can't stream a tool call (at least that I'm aware), because the tool call itself isn't user facing in an agent πŸ€”
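To illustrate the point above, here is a minimal plain-Python sketch (not LlamaIndex code; `run_tool` and `agent_stream` are hypothetical stand-ins) of why the tool output finishes before the agent streams anything: the agent's loop consumes the tool result in full, then feeds it to the LLM, and only that final LLM answer is streamed to the user.

```python
# Sketch of an agent loop: the tool call completes fully before any
# user-facing streaming begins.
def run_tool(query):
    # Stand-in for the query engine tool; even if the engine streams
    # internally, the agent joins the chunks before proceeding.
    chunks = ["retrieved ", "context"]
    return "".join(chunks)

def agent_stream(query):
    tool_result = run_tool(query)  # completes fully first
    # Only the final synthesis step is streamed token by token.
    for token in f"Answer using {tool_result}".split():
        yield token

print(" ".join(agent_stream("q")))  # prints "Answer using retrieved context"
```

So "streaming" here applies to the agent's final answer, not to the intermediate tool call.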
Hello @Leonardo Oliva

I have a use case where use_async=True and streaming=True. Do you know any workaround or idea to proceed further?
Just try use_async=False
Hi @Leonardo Oliva

Turning off async works. But I am building a UI for my chat engine. The UI seems to be a bit more interactive when I use async methods.

Following is the code snippet:

response = await chat_engine.astream_chat(query)

await response.aprint_response_stream()

The above code works. But there is no streaming output for some reason 🀷🀷
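When a UI is involved, the usual pattern is to iterate the response's token generator and push each token to the UI as it arrives, rather than printing the stream in one call. The sketch below uses plain asyncio (no LlamaIndex; `fake_token_stream` is a hypothetical stand-in for whatever the streaming response yields token by token) to show the consumption pattern:

```python
import asyncio

async def fake_token_stream():
    # Hypothetical stand-in for an LLM token stream.
    for token in ["Hello", ", ", "world", "!"]:
        await asyncio.sleep(0)  # yield control, as a real stream would
        yield token

async def consume_incrementally():
    # Tokens are handled as they arrive -- this is what makes the UI
    # feel responsive, instead of waiting for the full response.
    chunks = []
    async for token in fake_token_stream():
        chunks.append(token)  # e.g. push each token to the UI here
    return "".join(chunks)

print(asyncio.run(consume_incrementally()))  # prints "Hello, world!"
```

If your LlamaIndex version exposes an async token generator on the streaming response, iterating it this way inside your UI's event loop should surface tokens incrementally.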