
Anyone know why I keep getting a ModuleNotFoundError from llama_index.llms?

At a glance

The community member is experiencing an issue with the llama_index.llms module: `from llama_index.llms import OpenAI` raises ModuleNotFoundError on their virtual machine (VM) but works in their local VS Code environment. Other community members suggest checking whether a local file or folder is shadowing the package, then checking the installed version of llama-index and upgrading it if necessary. The issue turns out to be an outdated version of the package (0.6.32), and the solution is to upgrade it with pip install --upgrade llama-index.

Anyone know why I keep getting a from llama_index.llms import OpenAI
ModuleNotFoundError: No module named 'llama_index.llms' on my vm but works fine in vs
11 comments
Do you have a file or folder named llama_index in your working directory?
Otherwise, double check the installed version: pip show llama-index
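The shadowing question above is worth understanding: Python puts the script's own directory at the front of sys.path, so a local file or folder named like an installed package wins over site-packages. A minimal sketch of that behavior, using a throwaway module name (shadow_demo is purely illustrative, not part of llama-index):

```python
import os
import sys
import tempfile

# Create a local "package" in a scratch directory -- this plays the role
# of an accidental llama_index.py sitting in your working directory.
demo_dir = tempfile.mkdtemp()
with open(os.path.join(demo_dir, "shadow_demo.py"), "w") as f:
    f.write("VALUE = 'loaded from the local file, not site-packages'\n")

# Mimic the working directory being first on the import path.
sys.path.insert(0, demo_dir)
import shadow_demo

# The import resolves to the local file; the same mechanism would make a
# local llama_index file hide the real installed package.
print(shadow_demo.__file__)  # a path inside demo_dir
print(shadow_demo.VALUE)
```

Checking `some_module.__file__` this way is a quick test for whether you imported the package you think you did.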
I have an index.py and index.js but that's it
I ran into the same problem in vs and then I changed the interpreter to my cd in a venv and that worked
but now that I need to implement my bot to run live on my vm 24/7 this error has me at a standstill
Name: llama-index
Version: 0.6.32
Summary: Interface between LLMs and your data
Home-page: https://github.com/jerryjliu/llama_index
Author: Jerry Liu
Author-email: None
License: MIT
Location: /home/nicholas_demiceli/.local/lib/python3.9/site-packages
Requires: pandas, langchain, tenacity, urllib3, sqlalchemy, numpy, typing-inspect, fsspec, tiktoken, openai, dataclasses-json, typing-extensions
Required-by: llama-hub
Oh you have a really old version
pip install --upgrade llama-index
Can't believe the oversight on that. Thank you!
If you are into investing, I am creating a server around my company. I'll tell you more about it if you're interested, but if not, no worries :)