Version

When installing with "pip install llama-index-embeddings-huggingface", I ran into version conflicts. I also tried installing a package version below 0.1.3 and ran into another issue I don't fully understand. I'm just trying to run "https://github.com/run-llama/python-agents-tutorial/blob/main/3_rag_agent.py" with non-OpenAI embeddings, since I am running LlamaIndex locally. Can anyone guide me?
You probably want to change that to <0.3.0 for huggingface embeddings, or you will need to adjust the version of llama-index-core you have
I would leave the version off, and let pip or poetry figure out the working version for your current setup
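A minimal sketch of both approaches (quote the constraint so the shell does not interpret the "<"; the exact versions that resolve will depend on your environment):

    # let pip resolve compatible versions of everything
    pip install llama-index-core llama-index-embeddings-huggingface

    # or pin the embeddings package below 0.3.0
    pip install "llama-index-embeddings-huggingface<0.3.0"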
When I leave it for pip to handle the versions, it ends up with a version conflict and asks me to specify or remove versions to resolve it. I just tried pip install "llama-index-embeddings-huggingface<0.3.0" and got another subprocess-exited-with-error message.
you'll have to give me your list of reqs or a way to reproduce
installs fine for me
This is the error I get when I try to use HuggingFace embeddings in my environment. Would you like my "pip list"? Would that help? I am following the "https://docs.llamaindex.ai/en/stable/understanding/agent/" example, but instead of OpenAI I am using my own LLM and HuggingFace embeddings, which is when I ran into this issue. https://github.com/run-llama/python-agents-tutorial/blob/main/3_rag_agent.py is the example that fails; 1_basic_agent.py and 2_local_agent.py worked fine.
Attachment: Screenshot_2024-11-17_at_8.58.29_PM.png
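For context, the change I am making to 3_rag_agent.py is roughly this sketch (the embedding model name here is only a placeholder, not necessarily the one I used):

    # assumes: pip install llama-index-embeddings-huggingface
    from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
    from llama_index.embeddings.huggingface import HuggingFaceEmbedding

    # replace the default OpenAI embeddings with a local HuggingFace model
    Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
    # Settings.llm is pointed at my own local LLM, as in 2_local_agent.py

    documents = SimpleDirectoryReader("data").load_data()
    index = VectorStoreIndex.from_documents(documents)  # embeds with Settings.embed_model
    query_engine = index.as_query_engine()              # generation uses Settings.llm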
Seems like an issue in your torch version?

Works fine for me locally, as well as on google colab
https://colab.research.google.com/drive/1B5sh65OvbEYBMl5oqvmZVgcxhD2F0ioI?usp=sharing
Maybe start with a fresh venv?
I can try to start a fresh venv, but how do I check my torch version? I don't remember specifying it anywhere. I'll try a fresh venv and keep you posted tomorrow. Thank you.
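Two quick ways to check the installed torch version in the active venv:

    pip show torch                                       # prints version details, or nothing if torch is absent
    python -c "import torch; print(torch.__version__)"   # raises ImportError if torch is missing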
I did try a new fresh venv and got this error on the embeddings, and my llama-index version is different from yours. Please take a look at my screenshots: mine says llama-index==0.11.23 whereas yours is 0.12.0. Does that make any difference? I am also attaching my poetry shell config file. Can we force certain versions of llama-index to install? Another observation: when I ran "pip freeze | grep torch" in my poetry shell, nothing came back.
Attachments: Screenshot_2024-11-19_at_8.30.21_PM.png, Screenshot_2024-11-19_at_8.32.33_PM.png
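On forcing a particular llama-index version, with pip that is just an explicit pin (0.12.0 here is only the number from the comparison above, not necessarily the right target for your setup):

    pip install "llama-index==0.12.0"   # pin the metapackage
    pip freeze | grep -i llama          # confirm which llama-index-* versions actually resolved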
I don't think that's your full toml?

It would be really helpful if you could give me the full list of deps you are trying to install, or if you can reproduce in Google colab
Thank you. All I did was change my .toml to have [tool.poetry.dependencies]
python = "^3.11" instead of "3.13". Just that change, then I recreated the venv and everything went fine. Thank you, the issue is closed for me.
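For anyone hitting the same thing, the relevant part of the pyproject.toml ended up roughly like this (the non-python lines are illustrative, not the exact file):

    [tool.poetry.dependencies]
    # python was previously pinned to 3.13; at the time some dependencies (e.g. torch)
    # did not yet ship wheels for 3.13, so relaxing the constraint let poetry resolve
    python = "^3.11"
    llama-index = "*"
    llama-index-embeddings-huggingface = "*"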