Updated 11 months ago

How can I make a streaming response arrive at my frontend in real time?
2 comments
You can take this example: https://github.com/run-llama/sec-insights

It has both a frontend and a backend.

Check it live here: https://www.secinsights.ai/
That example makes use of server-sent events.
Another approach is websockets: open a websocket on the backend with FastAPI and send each delta produced by the LLM's .stream_response() generator.