Hmm, that's an interesting way of installing on top of llama-index. Which is recommended, the CLI command or the Python code provided? 🤔
The first method does not work when building a Docker image:
```
download_llama_pack(
  File "/opt/conda/lib/python3.10/site-packages/llama_index/core/llama_pack/download.py", line 56, in download_llama_pack
    pack_cls = download_llama_pack_template(
  File "/opt/conda/lib/python3.10/site-packages/llama_index/core/download/pack.py", line 126, in download_llama_pack_template
    spec.loader.exec_module(module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 879, in exec_module
  File "<frozen importlib._bootstrap_external>", line 1016, in get_code
  File "<frozen importlib._bootstrap_external>", line 1073, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/app/chain_of_table_pack/llama_index/packs/tables/base.py'
```
The second method is one I would like to avoid.