Hmm, let me try this locally
Hmm, seems to stream just fine for me
ok let me try on another computer ...
well AT FIRST it worked .... but I was on llama_index 0.7.4 (I think) ... upgraded to 0.7.17 and it stopped working (on that new computer) ...
Will try to downgrade ...
yup ... 0.7.4 streaming works ... not 0.7.17 ... trying now 0.7.16 ...
0.7.10 crashes with 'StreamingAgentChatResponse' object has no attribute 'print_response_stream'
And this is all with the same code above?
Mind blowing tbh, the latest version should be working. That's what I used above
How do you know it's not streaming?
0.7.11 same crash as 0.7.10
If the response is short enough, it might appear like it's not streaming
Could prompt it to write something longer, like a short story
yes, but there's also the "wait" ...
yes you can try with the "write me a poem" query
older versions won't have that attribute
Right, but the response type changed from StreamingResponse to StreamingAgentChatResponse, so that we can align the response types for agents/chat engines
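A minimal sketch of the pattern being described, using a hypothetical stand-in class (not the real llama_index implementation): the newer response object exposes both a token generator (`response_gen`) and a `print_response_stream()` helper, which is why versions that returned the new type but hadn't added the helper yet raised the AttributeError above.

```python
import sys
from typing import Iterable, Iterator


class StreamingAgentChatResponse:
    """Hypothetical stand-in for the llama_index response object,
    showing the pattern: a token generator plus a print helper."""

    def __init__(self, tokens: Iterable[str]):
        # Generator yielding tokens one at a time, as they arrive.
        self.response_gen: Iterator[str] = (t for t in tokens)

    def print_response_stream(self) -> None:
        # Print each token immediately; flush so output isn't buffered.
        for token in self.response_gen:
            print(token, end="", flush=True)


# Consuming the generator directly also works, which is the workaround
# when print_response_stream is missing on an older version.
resp = StreamingAgentChatResponse(["Hello", " ", "world"])
for token in resp.response_gen:
    sys.stdout.write(token)
    sys.stdout.flush()
```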
I can't reproduce on my end, so I'm really not sure what the issue is π€
So to sum up: 0.7.5 was the last working version ... from 0.7.6 to 0.7.11: crash ... 0.7.12 and onwards: no crash, but no stream ...
Maybe a fresh venv would help
I'm not using venv (yes I know ... :p )
on bash:
python -m venv venv
source venv/bin/activate
pip install llama-index
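(Editor's note: since the user is on Windows 11, the `source` line above is the bash form; the equivalent activation in PowerShell would be:)

```powershell
python -m venv venv
.\venv\Scripts\Activate.ps1
pip install llama-index
```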
nooooo don't make me do this π
Trust me, it makes python dev so much better
I used to be opposed too, but I've been converted haha
So I was told ... but no problem so far ... and ... that's another thing I need to "tame" ... π
Then you'll know for sure your env is using the right versions of packages π
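To confirm which version the active environment actually resolves, a quick stdlib check works (assuming the distribution name `llama-index`; the helper below is just an illustrative wrapper):

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(dist_name: str):
    """Return the installed version string of a distribution, or None if absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None


# Prints e.g. "0.7.17" in the active env, or None if not installed.
print(installed_version("llama-index"))
```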
ok ... I need to tame "venv" before going further ... I'm on Win11, folders are sync'd by dropbox across several work computers etc ... π
ok still not working (I think) ...
are you using WSL? or powershell?
ok, this works:
for token in response.response_gen:
    print(token, end="", flush=True)
instead of
response.print_response_stream()
Notice the flush=True, which forces flushing of the output stream
I'm using "WT" which is the new "windows terminal" ...
actually super important haha. Will merge a PR shortly π
cool ! glad I could help ! π
Thanks for the patience and testing!
really loved your videos on the YT channel, waiting for more ...
also I might have a couple of other questions in the next few days π
Hey thanks! I'm just making videos as I have time to actually build the project out. Glad you like them!
And no worries, happy to help!