Ok, seems like I made the situation even worse by trying to uninstall and then reinstall llama-index
pip uninstall llama-index
Now I run into an issue reinstalling llama-index.
pip install llama_index==0.10.9
Got llama_index reinstalled but no luck installing any version after 0.10.10, so that's at least an ok sign? But still having the llama_index.core.llms.base issue.
which documentation example are you following?
I think this is a chain problem.
The onnxruntime installed is too old and I'm having a hard time updating that, then this leads to llama-index not being able to upgrade to the newest version.
Ah yeah, it must have been left by mistake
The correct import would be
from llama_index.core.llms import ChatMessage, ChatResponse
in Agent/openai/step.py?
Did you mean to respond to another post?
I'm confused.
Wait, the import is correct in the repo. How did you upgrade?
No no, only this. You were getting an import issue
Try doing this:
pip uninstall llama-index
pip install llama-index --upgrade --no-cache-dir --force-reinstall
The onnxruntime version is too old, but my pip couldn't find the newest version.
You are installing the latest version right?
latest version for onnxruntime or llama-index?
Yes, I've tried to
pip install llama-index==0.10.13
Same issue.
onnxruntime causes llama-index-vector-stores-chroma to fail to install, then llama-index-cli fails, then llama-index fails.
Any version of llama-index higher than 0.10.9 fails.
I assume the old version of llama-index is causing the No module named 'llama_index.core.llms.base' error
Feels like a rabbit hole...
let me check installing on colab
Ok, I guess it's something on my local that's really messed up...
It's so weird, it was literally working last night.
Try with a new env, I highly recommend this
And install like this in the new env
pip install llama-index --upgrade --no-cache-dir --force-reinstall
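If it helps, a fresh env can be created like this (assuming python3 with the built-in venv module; the env name is just an example):

```shell
# Create and activate a brand-new virtual environment (name is arbitrary)
python3 -m venv fresh-env
source fresh-env/bin/activate

# Then reinstall from scratch, bypassing pip's cache
pip install llama-index --upgrade --no-cache-dir --force-reinstall
```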
Yeah, I've been testing my luck too much with just the local env not using virtualenv
So instead of
from llama_index.core.llms import ChatMessage, ChatResponse
It's
from llama_index.core.base.llms.types import ChatMessage, ChatResponse
now?
Create a new env and I'm pretty sure this import issue will go away
Dumb question,
Is the new env created based on the current local env? Or is it a fresh start?
If you add --no-cache-dir, it won't pick up any existing library from pip's cache
pip install llama-index --upgrade --no-cache-dir --force-reinstall
Ok, got it activated and it's installing lots of things, so it's a fresh start.
Well, that didn't work either.
It installed 0.10.9 for llama-index by default.
Then I tried to reinstall with llama-index 0.10.13, and it failed on onnxruntime again.
@Logan M Maybe you can take a look?
Now it's even weirder that my local env and my venv both fail with the same issue.
I was able to install onnxruntime 1.17.1 using
SYSTEM_VERSION_COMPAT=0 pip install --no-cache-dir "onnxruntime>=1.17.1"
But still having issues upgrading llama-index to any version that's newer than 0.10.9
It should install the latest version. Are you specifying a version?
pip install llama-index==0.10.13 --upgrade --no-cache-dir --force-reinstall
no need to mention the version name
pip install llama-index --upgrade --no-cache-dir --force-reinstall
Weird, it installed easily on Colab
Colab is different from Mac so not sure if that plays a part.
Given that my pip couldn't find onnxruntime without me forcing SYSTEM_VERSION_COMPAT=0.
Also, doesn't seem like I'm the only one with this issue.
Yeah, there were some breaking changes due to the namespace change when upgrading from 0.9 to 0.10
but I did not get any issue in a new env, though I have only tried this on Colab and Windows
so Mac may require something extra
Yeah, I had to deal with the upgrade earlier this month.
Very frustrating since I have to deal with it again.
Ok, getting deeper into the rabbit hole, seems like
llama-index-core 0.10.13 and llama-index-vector-stores-chroma 0.1.4 were having conflicts.
I resolved it by forcing SYSTEM_VERSION_COMPAT=0
Now llama-index == 0.10.13.post1 is installed, but I'm still having the issue
ModuleNotFoundError: No module named 'llama_index.core.llms.base'
can you give the full error
For code
I used to have
from llama_index.core.llms import ChatMessage, ChatResponse
Now switched to
from llama_index.core.base.llms.types import ChatMessage, ChatResponse
Error
We are back to square one; it's the same error that I posted at the very top.
Lol, yeah, but I'm still not sure if this is the latest version of the code
pip freeze | grep llama-index
I haven't modified the code, solely trying to fix all the dependency issues.
you created the new env right?
can you do this in that env
pip show llama-index
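And to double-check which interpreter and site-packages your shell is actually answering with (a generic sanity check, nothing llama-index specific):

```shell
# Which Python binary is first on this shell's PATH?
which python

# Which environment does it belong to, and where does it import packages from?
python -c "import sys; print(sys.prefix)"
python -c "import sysconfig; print(sysconfig.get_paths()['purelib'])"
```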
Two envs are activated in your screenshot
I did have to use SYSTEM_VERSION_COMPAT=0 to force llama-index-core 0.10.13 and llama-index-vector-stores-chroma 0.1.4 so that llama-index 0.10.13 could be installed.
(.venv) (base) means two envs are activated???
do you also have conda installed?
Yeah, conda is installed but I'm not using it.
Ok, now I feel like an idiot.
should I do conda deactivate
first?
Okay, I think I got the problem
Check the SS, I have an env called llama
so if you check the path mentioned it says C:\Users\asharma\anaconda3\envs\**llama**\lib\site-packages\llama_index\core\base\llms\types.py
but if we look at your SS, the error does not mention the env at all
oh shoot, you are right, I do see anaconda in my error code, but not my virtual environment.
source deactivate
This should deactivate the conda env
Yep, we are on the same page.
It's actually
conda deactivate
for the new version of conda.
See if this deactivates your env
Yep, I'm out of conda and just reactivated my venv.
Also how are you testing? using debugger? or just launching a server?
OMG, it's WORKING NOWWWWWWW
I cannot believe it literally took me the whole morning.
Thank you so much for following the thread and helping out.
But a couple of things that might be helpful,
- Anaconda's env is preventing llama-index-core 0.10.13 and llama-index-vector-stores-chroma 0.1.4 from being installed at the same time for some weird reason.
- Anaconda's env also prevents onnxruntime 1.17.1 from being installed, which causes llama-index-vector-stores-chroma to fail to install; then llama-index-cli and llama-index-core fail, and then any version of llama-index higher than 0.10.9 fails to install.
- SYSTEM_VERSION_COMPAT=0 is super useful.
- Use virtual environments!!!
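For anyone else landing here, the whole sequence that finally worked for me can be sketched roughly like this (macOS; the env name and versions are just what I happened to use):

```shell
# Make sure no conda env is layered underneath (safe to run even without conda)
conda deactivate 2>/dev/null || true

# Start from a completely fresh virtual environment
python3 -m venv .venv
source .venv/bin/activate

# On macOS, SYSTEM_VERSION_COMPAT=0 lets pip see the newer onnxruntime wheels
SYSTEM_VERSION_COMPAT=0 pip install --no-cache-dir "onnxruntime>=1.17.1"

# Now llama-index newer than 0.10.9 installs cleanly
SYSTEM_VERSION_COMPAT=0 pip install llama-index==0.10.13 --upgrade --no-cache-dir --force-reinstall
```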
Also just wrote up the step-by-step solutions here.