StorageContext
In the latest version of LlamaIndex, you can use the following import:

```python
from llama_index import StorageContext
```
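The correct import path depends on which version is installed: LlamaIndex moved its core modules into the `llama_index.core` package in v0.10, while older releases exposed `StorageContext` at the top level. A minimal sketch that handles both layouts (and the case where the library is absent):

```python
# LlamaIndex moved core modules to `llama_index.core` in v0.10;
# pre-0.10 versions expose StorageContext at the package top level.
try:
    from llama_index.core import StorageContext  # v0.10+
except ImportError:
    try:
        from llama_index import StorageContext  # pre-0.10
    except ImportError:
        StorageContext = None  # llama-index is not installed in this environment
```

In application code you would normally pin one version and use a single import rather than a fallback; the try/except is only useful in code that must run against both layouts.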
Trying `from llama_index.llms.vllm import VllmServer` fails with:

```
ImportError: cannot import name 'VllmServer' from 'llama_index.llms.vllm' (/opt/conda/lib/python3.10/site-packages/llama_index/llms/vllm/__init__.py)
```
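An `ImportError` like this usually means an older release of the integration package is installed (the thread below suggests `VllmServer` is only exported in 0.1.x, not 0.0.1). You can confirm which version pip actually installed; a small sketch using only the standard library:

```python
from importlib.metadata import version, PackageNotFoundError

# Report the installed version of the vllm integration package, if any.
# Assumption (from this thread): VllmServer exists in 0.1.x releases,
# so seeing 0.0.1 here would explain the ImportError above.
try:
    installed = version("llama-index-llms-vllm")
except PackageNotFoundError:
    installed = None

print(installed or "llama-index-llms-vllm is not installed")
```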
Installing from the requirements file produces this resolver error (Docker build timestamps stripped):

```
ERROR: Cannot install -r ./requirements.txt (line 14), -r ./requirements.txt (line 17) and llama-index-core because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested llama-index-core
    llama-index-llms-vllm 0.0.1 depends on llama-index-core<0.10.0 and >=0.9.32
    llama-index-embeddings-text-embeddings-inference 0.1.1 depends on llama-index-core<0.11.0 and >=0.10.1
    The user requested llama-index-core
    llama-index-llms-vllm 0.0.1 depends on llama-index-core<0.10.0 and >=0.9.32
    llama-index-embeddings-text-embeddings-inference 0.1.0 depends on llama-index-core==0.10.0
```
The relevant lines of requirements.txt:

```
llama-index-llms-vllm
llama-index-core
llama-index
llama-index-embeddings-text-embeddings-inference
```
> llama-index-llms-vllm 0.0.1 depends on llama-index-core<0.10.0 and >=0.9.32

Well, this is just not true anymore: the latest release of llama-index-llms-vllm (v0.1.1) lists `llama-index-core = "^0.10.1"` as a requirement. So I don't think it's technically an issue anymore, although I have no idea why your pip install didn't see v0.1.1 of llama-index-llms-vllm.
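One way to keep the resolver from backtracking onto the old 0.0.1 release is to pin minimum versions explicitly in requirements.txt. A sketch of a consistent set, assuming the version ranges quoted in this thread (vllm 0.1.1 requiring core `^0.10.1`, and embeddings 0.1.1 allowing core `>=0.10.1,<0.11.0`):

```
llama-index-core>=0.10.1,<0.11.0
llama-index-llms-vllm>=0.1.1
llama-index-embeddings-text-embeddings-inference>=0.1.1
```

With the core range pinned, pip can no longer satisfy the conflict by falling back to llama-index-llms-vllm 0.0.1, which is what produced the error above.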