create-llama
I have a create-llama app and there's an issue where the Python FastAPI server just hangs. I've narrowed it down to these lines:

```python
chat_engine = index.as_chat_engine()
print("sending")
response = chat_engine.stream_chat(lastMessage.content, messages)
print("received")
```
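For context, here is a simplified sketch of the surrounding route. The request models and index loading in my app differ a bit (`get_index` below is just a placeholder for however the index is built), and the `ChatMessage` import path depends on the llama_index version, but the flow is the same: build the chat history, call `stream_chat`, then stream tokens back from `response.response_gen`.

```python
# Simplified sketch of the surrounding FastAPI route -- not an exact copy.
from typing import List

from fastapi import APIRouter
from fastapi.responses import StreamingResponse
from llama_index.llms import ChatMessage  # import path varies by llama_index version
from pydantic import BaseModel

router = APIRouter()


class _Message(BaseModel):
    role: str
    content: str


class _ChatData(BaseModel):
    messages: List[_Message]


@router.post("/api/chat")
async def chat(data: _ChatData):
    # The last message is the new user prompt; the rest become chat history.
    lastMessage = data.messages[-1]
    messages = [
        ChatMessage(role=m.role, content=m.content) for m in data.messages[:-1]
    ]

    index = get_index()  # placeholder for however the index is loaded in my app

    chat_engine = index.as_chat_engine()
    print("sending")
    response = chat_engine.stream_chat(lastMessage.content, messages)
    print("received")  # never printed -- the call above is where it hangs

    # Tokens are streamed back to the client from response.response_gen.
    async def event_generator():
        for token in response.response_gen:
            yield token

    return StreamingResponse(event_generator(), media_type="text/plain")
```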