I could use some help getting `py2wasm` to compile llama-index. A simple test can be reproduced as follows:
`llama.py`:

```python
from llama_index.core.chat_engine.types import StreamingAgentChatResponse
```
and then:

```shell
py2wasm -o llama.wasm llama.py
```
I think it's getting stuck while compiling Pillow, though I'm not 100% sure. py2wasm was only recently forked from Nuitka by the Wasmer people, so this is obviously outside normal scope, but being able to run llama-index in edge functions/workers would be a great benefit.
https://wasmer.io/posts/py2wasm-a-python-to-wasm-compiler