Hi! I am setting up llama_index for local development and can't get the import of local modules to resolve.

I followed the instructions at https://github.com/run-llama/llama_index/blob/79fce27802943b8eb093f8cb37d278a3b4a1e213/CONTRIBUTING.md and https://docs.llamaindex.ai/en/stable/getting_started/installation.html#installation-from-source, and everything works fine up to that point.

But then I can't get the import of the module (e.g. OpenAI) from the package that I installed locally with pip install -e llama-index-integrations/llms/llama-index-llms-openai to work. I tried both the usual import for installed packages (e.g. from llama_index.llms.openai import OpenAI) and a relative import from the local directory (with importlib.import_module), but I can't get anything to resolve.
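An IDE-independent way to check whether (and from where) Python would resolve an import is importlib.util.find_spec. A minimal sketch — shown on a stdlib module so it runs anywhere; substitute the llama_index module name in your own environment:

```python
import importlib.util

# Substitute "llama_index.llms.openai" for "json" in your environment;
# find_spec returns None if the import cannot be resolved at all.
spec = importlib.util.find_spec("json")
if spec is None:
    print("module not importable from this interpreter")
else:
    print(spec.origin)  # filesystem path the module would load from
```

If this prints a path, the package is importable and the problem is likely the IDE's interpreter selection rather than the installation itself.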

Probably something simple I am missing here ...? Any help would be appreciated πŸ™
Did you have llama-index previously installed in your current env?
If so, I would suggest removing it first:
pip uninstall llama-index
pip install llama-index

By installing llama-index, the OpenAI integration is installed by default.
You can then try importing it like this:
from llama_index.llms.openai import OpenAI
I ran into unresolved imports yesterday after migrating over to core.

For some reason, in VS Code, when I selected a different interpreter and then reselected the one I wanted, it started recognizing them. Not sure if that's your issue, but you might want to try that. That wouldn't be llama-index's fault, though.
Maybe a VS Code bug/oddity. I was coding over SSH, though, so maybe something got lost in translation and selecting/reselecting the interpreter cleaned it up.
You messed with the golden rule of coding:
If it is working do not touch it!! πŸ˜†
That worked! Thanks @WhiteFang_Jr.

I did have llama-index installed before, but in a different virtualenv. I ran pip install llama-index and even though all requirements were already satisfied, something must have changed, because from llama_index.llms.openai import OpenAI works now and pulls the module from the local dev environment.
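A quick way to confirm that kind of resolution yourself is to print the active interpreter and the file an import actually loads from (json stands in for the llama_index module here so the snippet runs anywhere):

```python
import sys

# First confirm which interpreter is actually running -- picking up a
# different virtualenv is a common cause of "requirement already
# satisfied" surprises.
print(sys.executable)

# After a successful import, __file__ shows which installation is
# active; for a `pip install -e` package it points into the source
# checkout rather than site-packages.
import json  # substitute: import llama_index.llms.openai
print(json.__file__)
```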
Thanks again for your help!