When I try to run llamaindex-cli I get

Plain Text
$ llamaindex-cli-tool.exe upgrade file.py 
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "...\venv\Scripts\cli-tool.exe\__main__.py", line 4, in <module>
  File "...\venv\Lib\site-packages\package\cli\command_line.py", line 4, in <module>
    from package.cli.module import ModuleCLI, default_modulecli_persist_dir
  File "...\venv\Lib\site-packages\package\cli\module\__init__.py", line 1, in <module>
    from package.cli.module.base import ModuleCLI, default_modulecli_persist_dir
  File "...\venv\Lib\site-packages\package\cli\module\base.py", line 9, in <module>
    from package.core import (
ImportError: cannot import name 'SimpleDirectoryReader' from 'package.core' (unknown location)


Here's some pip freeze output, for reference:

Plain Text
llama-hub==0.0.79.post1
llama-index==0.10.14
llama-index-agent-openai==0.1.5
llama-index-cli==0.1.7
llama-index-core==0.10.14.post1
llama-index-embeddings-openai==0.1.6
llama-index-indices-managed-llama-cloud==0.1.3
llama-index-legacy==0.9.48
llama-index-llms-openai==0.1.7
llama-index-multi-modal-llms-openai==0.1.4
llama-index-program-openai==0.1.4
llama-index-question-gen-openai==0.1.3
llama-index-readers-file==0.1.6
llama-index-readers-llama-parse==0.1.3
llama-index-vector-stores-chroma==0.1.5
llama-parse==0.3.5
llamaindex-py-client==0.1.13
uhhhh that is weird
Maybe try with a fresh venv?
Earlier, your migration guide was blocked by our firewall, but now I can read it and I see what might cause the issue. A fresh venv should fix that.

Plain Text
Since v0.10 makes use of Python namespace packages, and v0.9.xx or earlier does not, any remnants of legacy llama-index will surely break import paths and cause errors.
Should also make sure I didn't install llama-index globally
yup, definitely!
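One quick sanity check (a minimal sketch, nothing specific to the CLI, just standard introspection run with the venv's Python) is to see where llama_index actually resolves from; if any path points outside the venv, a global or legacy install is still shadowing it:

Plain Text
# Run with the venv's Python. llama_index is a namespace package in v0.10,
# so __path__ may list several directories; all of them should live inside the venv.
import llama_index
print(list(llama_index.__path__))

# This import is what the CLI tripped over, so it should now work directly:
from llama_index.core import SimpleDirectoryReader
print(SimpleDirectoryReader)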
Ok, so a fresh venv makes the tool run, but:

  1. it failed on a weird character and didn't tell me where in my code it failed.
  2. it added core in some places and those imports still look broken. If there is something else that needs to be installed, it should tell me about that or something.
Plain Text
Module not found: BaseEmbedding as llama_index_BaseEmbedding
Switching to core
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "...\venv2\Scripts\llamaindex-cli.exe\__main__.py", line 7, in <module>
  File "...\venv2\Lib\site-packages\llama_index\cli\command_line.py", line 269, in main
    args.func(args)
  File "...\venv2\Lib\site-packages\llama_index\cli\command_line.py", line 227, in <lambda>
    upgrade_parser.set_defaults(func=lambda args: upgrade_dir(args.directory))
                                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...\venv2\Lib\site-packages\llama_index\cli\upgrade\base.py", line 287, in upgrade_dir
    upgrade_file(str(file_ref))
  File "...\venv2\Lib\site-packages\llama_index\cli\upgrade\base.py", line 271, in upgrade_file
    upgrade_py_md_file(file_path)
  File "...\venv2\Lib\site-packages\llama_index\cli\upgrade\base.py", line 253, in upgrade_py_md_file
    lines = f.readlines()
            ^^^^^^^^^^^^^
  File "...\Python\Python311\Lib\encodings\cp1252.py", line 23, in decode
    return codecs.charmap_decode(input,self.errors,decoding_table)[0]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
UnicodeDecodeError: 'charmap' codec can't decode byte 0x81 in position 7704: character maps to <undefined>
it added core in some places and those imports still look broken. If there is something else that needs to be installed, it should tell me about that or something. -- it's supposed to print it at the end lol, but I guess it broke before it got there
Are you running on a notebook? Some notebooks have weird contents that aren't easy to parse
You can run one file at a time with llamaindex-cli upgrade-file <file path>
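For the weird-character failure, something like this (a rough sketch, not part of the CLI, just plain Python) can tell you which file and byte the default Windows cp1252 codec chokes on before you run the upgrade again:

Plain Text
# Rough sketch: find files (and byte offsets) that can't be decoded with cp1252,
# which is the default encoding the upgrade tool ends up using on Windows.
from pathlib import Path

for pattern in ("*.py", "*.md"):
    for path in Path(".").rglob(pattern):
        data = path.read_bytes()
        try:
            data.decode("cp1252")
        except UnicodeDecodeError as err:
            print(f"{path}: byte {data[err.start]:#04x} at offset {err.start}")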

tbh I'm also happy to just help you update imports lol
It follows a pretty distinct pattern overall
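Roughly (a sketch, not exhaustive): the core abstractions move under llama_index.core, and every integration becomes its own pip package with a matching import path.

Plain Text
# Illustrative examples of the v0.10 pattern (not exhaustive):
# core abstractions now live under llama_index.core ...
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
# ... and each integration is its own package, e.g. pip install llama-index-llms-openai
from llama_index.llms.openai import OpenAI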
not a notebook, just code. I'll get there in the end, I think. Looking at the page https://pretty-sodium-5e0.notion.site/ce81b247649a44e4b6b35dfb24af28a6?v=53b3c2ced7bb4c9996b81b83c9f01139&p=00edf0d6571040ab83b69e2d08fe1d77&pm=s

llama-index-llms-azure-openai is not listed, and the versions there seem out of date. Is there a fuller list?
Most of my progress is made by searching the llama-index repo for the names of the functions I want to import.
The upgrade tool added core to all my Azure OpenAI stuff, which doesn't seem to be in core, so it wasn't that useful there
I had this:
Plain Text
from llama_index.llms import AzureOpenAI as llama_index_AzureOpenAI
from llama_index.embeddings import AzureOpenAIEmbedding as llama_index_AzureOpenAIEmbedding

the upgrade tool did this:
Plain Text
from llama_index.core.llms import AzureOpenAI as llama_index_AzureOpenAI
from llama_index.core.embeddings import AzureOpenAIEmbedding as llama_index_AzureOpenAIEmbedding


but I needed to add these to requirements.txt:
Plain Text
llama-index-llms-azure-openai==0.1.5
llama-index-embeddings-azure-openai==0.1.6

and import like this:
Plain Text
from llama_index.llms.azure_openai import AzureOpenAI as llama_index_AzureOpenAI
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding as llama_index_AzureOpenAIEmbedding
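For reference, with those two packages installed the v0.10 wiring then looks something like this (a sketch; the model names, deployment names, endpoint, and API version below are placeholders for whatever your Azure resource uses):

Plain Text
# Sketch only: plug the Azure integrations into the global Settings.
# Deployment names, endpoint, key, and api_version are placeholders.
from llama_index.core import Settings
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding

Settings.llm = AzureOpenAI(
    model="gpt-35-turbo",                       # placeholder
    deployment_name="my-llm-deployment",        # placeholder
    api_key="...",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_version="2023-07-01-preview",
)
Settings.embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",             # placeholder
    deployment_name="my-embedding-deployment",  # placeholder
    api_key="...",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_version="2023-07-01-preview",
)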
The more updated list is now on llama-hub
https://llamahub.ai/
huh weird, surprised it did that
maybe it was confused by the as ... syntax (I did zero handling for that haha)