Updated 10 months ago

How to import StorageContext in latest llama_index?

@kapa.ai How to import StorageContext in latest llama_index?
24 comments
Please stand by @Luke Skywalker as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
To import StorageContext in the latest version of LlamaIndex, you can use the following line of code:

Plain Text
from llama_index import StorageContext


This class is part of the core abstractions provided by LlamaIndex for the storage of Nodes, indices, and vectors. You can learn more about it in the LlamaIndex documentation.
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------
from llama_index.core import StorageContext
if you have import errors with that, make sure you have a completely fresh venv
the switch to namespaced packages means any remnants of a previous install cause issues
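One way to check whether a container image still carries remnants of a pre-0.10 install is to probe for the namespaced llama_index.core package; a minimal sketch (the helper name is mine, not part of LlamaIndex):

```python
import importlib.util

def is_importable(dotted_name: str) -> bool:
    """Return True if `dotted_name` resolves to an installed module.

    After the 0.10 switch to namespaced packages, `llama_index.core`
    should be importable; if only the old flat `llama_index` layout
    resolves, leftovers of a previous install are likely in the way.
    """
    try:
        return importlib.util.find_spec(dotted_name) is not None
    except ModuleNotFoundError:
        # A parent package in the dotted path is missing entirely.
        return False

# Example: is_importable("llama_index.core") should be True on a
# clean post-0.10 install, False on a stale pre-0.10 environment.
```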
thx. it's in a container, no venv
Plain Text
from llama_index.llms.vllm import VllmServer
extraction  | ImportError: cannot import name 'VllmServer' from 'llama_index.llms.vllm' (/opt/conda/lib/python3.10/site-packages/llama_index/llms/vllm/__init__.py)

Did something go wrong with the package installation, or do I need to import it from somewhere else?
pip install llama-index-llms-vllm
hmm let me check
looks like it got missed in the init file
You can do from llama_index.llms.vllm.base import VllmServer
will make a fix to add that to the init file 🙂
@Logan M
Plain Text
ERROR: Cannot install -r ./requirements.txt (line 14), -r ./requirements.txt (line 17) and llama-index-core because these package versions have conflicting dependencies.
11.36 
11.36 The conflict is caused by:
11.36     The user requested llama-index-core
11.36     llama-index-llms-vllm 0.0.1 depends on llama-index-core<0.10.0 and >=0.9.32
11.36     llama-index-embeddings-text-embeddings-inference 0.1.1 depends on llama-index-core<0.11.0 and >=0.10.1
11.36     The user requested llama-index-core
11.36     llama-index-llms-vllm 0.0.1 depends on llama-index-core<0.10.0 and >=0.9.32
11.36     llama-index-embeddings-text-embeddings-inference 0.1.0 depends on llama-index-core==0.10.0

🤔
requirements.txt
Plain Text
llama-index-llms-vllm
llama-index-core
llama-index
llama-index-embeddings-text-embeddings-inference
"llama-index-llms-vllm 0.0.1 depends on llama-index-core<0.10.0 and >=0.9.32" — well, this is just not true
[Attachment: image.png]
I wonder why it's not finding v0.1.1.
maybe try explicitly asking for it?
pip install llama-index-llms-vllm==0.1.1
The issue lies in the packages needing different llama-index-core versions
The solution seems to be installing llama-index-embeddings-text-embeddings-inference first and then the other libraries 🤷‍♂️
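Pinning versions in requirements.txt avoids depending on install order; a sketch using only the versions that appear in this thread (the pins are assumptions — check PyPI for the current releases):

Plain Text
llama-index-core>=0.10.1
llama-index-llms-vllm==0.1.1
llama-index-embeddings-text-embeddings-inference==0.1.1
llama-index

With explicit pins, pip's resolver never considers the yanked-or-stale 0.0.1 release of llama-index-llms-vllm that caused the conflict.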
As pointed out above, that was fixed in v0.1.1 of vllm, which has "^0.10.1" as a requirement. So I don't think it's technically an issue anymore, although I have no idea why your pip install didn't see v0.1.1 of vllm