Updated 11 months ago

Having issue with

At a glance

The community member hit a ModuleNotFoundError: No module named 'llama_index.core.llms.base' with the llama-index library, which had been working fine the previous night. Uninstalling and reinstalling did not help: every version above 0.10.9 failed to install, which was traced to an outdated onnxruntime blocking llama-index-vector-stores-chroma and, in turn, llama-index-cli and the newer llama-index releases. Checking the latest code repository showed the correct import is from llama_index.core.base.llms.types import ChatMessage, ChatResponse, but updating the import alone did not fix the error. The root cause turned out to be an active conda base environment shadowing the virtual environment; deactivating conda, reactivating the venv, and reinstalling (using SYSTEM_VERSION_COMPAT=0 so pip could find onnxruntime 1.17.1) resolved the issue.

Plain Text
ModuleNotFoundError: No module named 'llama_index.core.llms.base'

It was literally working last night and not sure what happened.
Attachment
image.png
97 comments
Full error log
Attachment
image.png
Ok, seems like I made the situation even worse by trying to uninstall and then reinstall llama-index
Plain Text
pip uninstall llama-index

Now I run into an issue reinstalling llama-index.
Plain Text
pip install llama_index==0.10.9
Got llama_index reinstalled, but no luck installing any version after 0.10.10, so that's at least an OK sign? But I'm still having the llama_index.core.llms.base issue.
which documentation example are you following?
I think this is a chain problem.
The installed onnxruntime is too old and I'm having a hard time updating it, which then prevents llama-index from upgrading to the newest version.
Ah yeah, it must have been left by mistake

Correct import would be
from llama_index.core.llms import ChatMessage, ChatResponse

in Agent/openai/step.py
?
Did you mean to respond to another post?
I'm confused.
Wait, the import is correct on the repo. How did you upgrade?
No no, only this. You were getting an import issue.
Try doing this:
Plain Text
pip uninstall llama-index

pip install llama-index --upgrade --no-cache-dir --force-reinstall
Still having issue.
The onnxruntime version is too old, but my pip couldn't find the newest version.
You are installing the latest version right?
latest version of onnxruntime or llama-index?
Yes, I've tried to
Plain Text
pip install llama-index==0.10.13

Same issue.
onnxruntime causes llama-index-vector-stores-chroma to fail to install, then llama-index-cli fails, then llama-index fails.
any version higher than 0.10.9 for llama-index fails.
I assume the old version of llama-index is causing the No module named 'llama_index.core.llms.base' error.
Feels like a rabbit hole...
let me check installing on colab
I'm able to install:
Attachment
image.png
Ok, I guess it's something on my local that's really messed up...
It's so weird, it was literally working last night.
Try with a new env I recommend this highly
And install like this in the new env

Plain Text
pip install llama-index --upgrade --no-cache-dir --force-reinstall
Yeah, I've been testing my luck too much with just the local env not using virtualenv
If you see in this error: The import is like this at last line
from llama_index.core.llms.base import ChatMessage, ChatResponse

Now check the latest code on repo here:
https://github.com/run-llama/llama_index/blob/b2f0a59c21f651bea1502818ec7f61ab915ca286/llama-index-integrations/agent/llama-index-agent-openai/llama_index/agent/openai/step.py#L31C1-L31C71


See the import path
So instead of
Plain Text
from llama_index.core.llms import ChatMessage, ChatResponse

It's
Plain Text
from llama_index.core.base.llms.types import ChatMessage, ChatResponse
now?
Create a new env and I'm pretty sure this import issue will go away
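When two environments might both be in play, it can help to check which installation an import actually resolves to before editing any code. This is a stdlib-only sketch (the helper name module_location is just illustrative, not part of llama-index):

```python
import importlib.util
import sys

def module_location(name: str) -> str:
    """Return the file a module would be imported from, or a note if absent."""
    try:
        spec = importlib.util.find_spec(name)
    except ModuleNotFoundError:
        spec = None
    if spec and spec.origin:
        return spec.origin
    return f"{name} not found under {sys.prefix}"

# If this path points into anaconda3 instead of your venv, the wrong
# environment is answering the import, and no amount of reinstalling
# inside the venv will change the error.
print(module_location("llama_index.core"))
```

A path under a conda site-packages directory here would confirm the stale-environment theory immediately.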
Dumb question,
Is the new env created based on the current local env? Or is it a fresh start?
If you add --no-cache-dir, it will not pick up any existing library from the cache.

pip install llama-index --upgrade --no-cache-dir --force-reinstall
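To make the "fresh start" explicit, the full sequence would look roughly like this on macOS/Linux (the env name fresh-env is arbitrary):

```shell
# Create a brand-new, isolated environment; nothing is inherited from
# the system interpreter or any previously installed packages.
python3 -m venv fresh-env
source fresh-env/bin/activate

# Install with no cache so stale wheels are not reused.
pip install llama-index --upgrade --no-cache-dir --force-reinstall
```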
Ok, got it activated and it's installing lots of things, so it's a fresh start.
Well, that didn't work either.
It installed 0.10.9 for llama-index by default.
Then I tried to reinstall with llama-index 0.10.13, and it failed on onnxruntime again.
@Logan M Maybe you can take a look?
Now it's even weirder that both my local env and the venv fail on the same issue.
I was able to install onnxruntime 1.17.1 using

Plain Text
SYSTEM_VERSION_COMPAT=0 pip install --no-cache-dir "onnxruntime>=1.17.1"


But still having issues upgrading llama-index to any version that's newer than 0.10.9
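For context on why SYSTEM_VERSION_COMPAT=0 helps here (my understanding, worth verifying): on macOS, pip chooses wheels based on the OS version Python reports, and some interpreter builds report a compatibility version like 10.16 instead of the real 11.x+, so wheels tagged macosx_11_0 (such as recent onnxruntime releases) are never offered. A quick check:

```python
import platform

# pip derives the macosx_* platform tag from this value. If it shows
# "10.16" on a Big Sur or later machine, wheels tagged macosx_11_0
# will be skipped; SYSTEM_VERSION_COMPAT=0 exposes the real version.
print(platform.mac_ver()[0] or "not running on macOS")
```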
It should install the latest version. Are you specifying a version?
just try with this,
I did try this, sir.
Plain Text
pip install llama-index==0.10.13 --upgrade --no-cache-dir --force-reinstall
no need to mention the version name
Plain Text
pip install llama-index --upgrade --no-cache-dir --force-reinstall
weird, it got installed easily on colab
Colab is different from Mac so not sure if that plays a part.
Given that my pip couldn't find onnxruntime without me forcing SYSTEM_VERSION_COMPAT=0.
Also, doesn't seem like I'm the only one with this issue.
Yeah, there were some breaking changes due to the shared namespace if you upgrade from 0.9 to 0.10.

But I did not get any issue in a new env. I have only tried this on Colab and Windows,
so Mac may require something extra.
Yeah, I had to deal with the upgrade earlier this month.
Very frustrating since I have to deal with it again.
Ok, getting deeper into the rabbit hole, seems like
llama-index-core 0.10.13 and llama-index-vector-stores-chroma 0.1.4 were having conflicts.

I resolved it by forcing to use SYSTEM_VERSION_COMPAT=0

Now llama-index == 0.10.13.post1 is installed but still having the issue
Plain Text
ModuleNotFoundError: No module named 'llama_index.core.llms.base'
can you give the full error
For code
I used to have
Plain Text
from llama_index.core.llms import ChatMessage, ChatResponse

Now switched to
Plain Text
from llama_index.core.base.llms.types import ChatMessage, ChatResponse


Error
Attachment
image.png
We are back to square one; it's the same error that I posted at the very top.
Lol, yeah, but I'm still not sure if this is the latest version of the code.
Plain Text
pip freeze | grep llama-index
Attachment
image.png
I haven't modified the code; I've solely been trying to fix all the dependency issues.
you created the new env right?
can you do this in that env

pip show llama-index
Two envs are activated in your setup.
I did have to use SYSTEM_VERSION_COMPAT=0 to force llama-index-core 0.10.13, llama-index-vector-stores-chroma 0.1.4 so llama-index 0.10.13 can be installed.
(.venv) (base) means two envs are activated???
do you also have conda installed?
Yeah, conda is installed but I'm not using it.
Ok, now I feel like an idiot.
should I do conda deactivate first?
Okay I think i got the problem
Check the screenshot: I have an env called llama, so the path in my trace says C:\Users\asharma\anaconda3\envs\**llama**\lib\site-packages\llama_index\core\base\llms\types.py

But if we look at your screenshot, the error does not mention your env at all.
Attachment
image.png
oh shoot, you are right. I do see anaconda in my error trace, but not my virtual environment.
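A one-line sanity check makes this kind of mix-up obvious: print which interpreter and environment are actually answering before blaming the package:

```python
import sys

# The interpreter that is really running, and the environment root it
# belongs to. If these point into anaconda3 while your prompt shows
# (.venv), the venv's python is not the one being used.
print("executable:", sys.executable)
print("prefix:    ", sys.prefix)
print("in a venv: ", sys.prefix != getattr(sys, "base_prefix", sys.prefix))
```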
source deactivate should deactivate the conda env.
Yep, we are on the same page,
It's actually
Plain Text
conda deactivate
for the newer versions of conda.
See if this deactivates your env
Yep, I'm out of conda and have just reactivated my venv.
All the best!! πŸ˜…
Also how are you testing? using debugger? or just launching a server?
OMG, it's WORKING NOWWWWWWW
I cannot believe it literally took me the whole morning.
Thank you so much for following the thread and helping out.
But a couple of things that might be helpful,

  1. Anaconda's env was preventing llama-index-core 0.10.13 and llama-index-vector-stores-chroma 0.1.4 from being installed at the same time, for some weird reason.
  2. Anaconda's env also prevented onnxruntime 1.17.1 from being installed, which caused llama-index-vector-stores-chroma to fail to install, then llama-index-cli and llama-index-core, then any version of llama-index higher than 0.10.9.
  3. SYSTEM_VERSION_COMPAT=0 is super useful.
  4. Use a virtual environment!!!
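Pulled together, the recovery sequence that worked in this thread looks roughly like this on macOS (the env name .venv is arbitrary, and step 3 only matters on interpreter builds that report a 10.16 compatibility version):

```shell
# 1. Make sure conda's base env is not shadowing the venv.
conda deactivate

# 2. Create and activate a fresh virtual environment.
python3 -m venv .venv
source .venv/bin/activate

# 3. Expose the real macOS version so pip can find modern onnxruntime wheels.
SYSTEM_VERSION_COMPAT=0 pip install --no-cache-dir "onnxruntime>=1.17.1"

# 4. Reinstall llama-index cleanly, bypassing any cached wheels.
pip install llama-index --upgrade --no-cache-dir --force-reinstall
```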
Also just wrote the step by step solutions here.