@Logan M I noticed the agent implementations have been refactored by @jerryjliu0, and when I try the usual OpenAI Agent example from the docs, I can't get it to work with astream_chat - it just prints STARTING TURN 1 and nothing beyond that. Am I doing something wrong, or is there a new way to interact with things?
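For context, here's roughly what I'm running (a minimal sketch rather than the exact notebook code; the import paths depend on the llama_index version, and the ./data directory and tool name are placeholders for my own setup):

Python
import asyncio

from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.agent import OpenAIAgent
from llama_index.tools import QueryEngineTool

# Build a small index and expose it to the agent as a query engine tool.
docs = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(docs)
tool = QueryEngineTool.from_defaults(
    query_engine=index.as_query_engine(),
    name="query_engine_tool",
    description="Answers questions over the indexed documents.",
)

agent = OpenAIAgent.from_tools([tool], verbose=True)

async def main():
    # astream_chat returns a streaming chat response; tokens come from the
    # async generator. This is where it stalls after printing STARTING TURN 1.
    response = await agent.astream_chat("Four lines about IMXRT1060RM")
    async for token in response.async_response_gen():
        print(token, end="", flush=True)

asyncio.run(main())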
oh shoot, let me check on this -- I actually have no idea haha
Hmmm, this seemed to work for me
Just ran that same example from the notebook
maybe you ran into an API glitch?
It takes 60s to timeout by default
Yeah it looks like it was an API thing, thanks @Logan M!
Ran into a different issue shortly after: I was testing chat_engine.astream_chat with chat_mode='openai' and got this error - ValueError: Streaming not supported for async
What's the full error?
Here's the stacktrace
I get -

Plain Text
=== Calling Function ===
Calling function: query_engine_tool with args: {
  "input": "Four lines about IMXRT1060RM"
}

and then the ValueError
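For reference, the repro is roughly this (a minimal sketch, same caveats as above about import paths and the placeholder ./data directory):

Python
import asyncio

from llama_index import SimpleDirectoryReader, VectorStoreIndex

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())
chat_engine = index.as_chat_engine(chat_mode="openai", verbose=True)

async def run():
    # The agent starts a turn and calls the query engine tool; the
    # ValueError is raised once that tool is asked to stream asynchronously.
    response = await chat_engine.astream_chat("Four lines about IMXRT1060RM")
    async for token in response.async_response_gen():
        print(token, end="", flush=True)

asyncio.run(run())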
Ah yes. That is an age-old issue.

We never set up async + streaming in the RetrieverQueryEngine

I think it should technically be possible to implement now though
So this happens because the chat engine uses an underlying agent with the index wired in as a query engine tool, which is a RetrieverQueryEngine under the hood, and that engine doesn't support async + streaming?
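Something roughly like this, I mean (a simplified sketch of what I understand chat_mode='openai' to be doing, not the actual library code; imports and names are illustrative):

Python
from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.agent import OpenAIAgent
from llama_index.tools import QueryEngineTool

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())

# index.as_query_engine() returns a RetrieverQueryEngine, which supports
# sync streaming but apparently not async + streaming.
tool = QueryEngineTool.from_defaults(
    query_engine=index.as_query_engine(),
    name="query_engine_tool",
    description="Queries the underlying index.",
)

# chat_mode="openai" effectively builds an OpenAIAgent over that one tool,
# so astream_chat on the chat engine ends up asking the RetrieverQueryEngine
# for an async streaming response, which is where the ValueError comes from.
agent = OpenAIAgent.from_tools([tool], verbose=True)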