
Updated last month

Logan M: I am using the Google Vertex AI API

I am using the Google Vertex AI API as the LLM and a Hugging Face embedding model. After the embedding completes, the index files are stored on a local drive. I try to retrieve those files using the following instructions:

# rebuild storage context
storage_context = StorageContext.from_defaults(persist_dir='/content/drive/MyDrive/data/vectordb')
# load index
index = load_index_from_storage(storage_context=storage_context)

and got the following error:
---------------------------------------------------------------------------
ValidationError Traceback (most recent call last)
<ipython-input-12-a5b638486930> in <cell line: 4>()
2 storage_context = StorageContext.from_defaults(persist_dir='/content/drive/MyDrive/data/vectordb')
3 # load index
----> 4 index = load_index_from_storage(storage_context=storage_context)

7 frames
/usr/local/lib/python3.10/dist-packages/pydantic/main.cpython-310-x86_64-linux-gnu.so in pydantic.main.BaseModel.init()

ValidationError: 1 validation error for OpenAI
root
Did not find openai_api_key, please add an environment variable OPENAI_API_KEY which contains it, or pass openai_api_key as a named parameter. (type=value_error)

How can I set up load_index_from_storage so that it does not use OpenAI as the default? Thanks.
6 comments
Hi @autratec ,
Your service context needs to have its llm_predictor pointing to the Google Vertex class, assuming LlamaIndex has support for it.
Yes, I did that. There is no issue if I use the index directly while it is still in memory after being generated. But I hit this issue if I save the files locally and load them again. I suspect there is a bug: load_index_from_storage uses OpenAI as the default and doesn't account for other LLMs and embedding tools being used.
@autratec, you need to pass the service context while loading the indexes as well if you are not using the default settings.
Otherwise, if you look at the code, it creates a new service context with default values (that is, OpenAI) whenever none is passed.
You can also set your service context as the global default; then you won't have to pass it anywhere, and your defined service_context will be picked up everywhere.
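The global-default approach above can be sketched as follows. This is a minimal sketch assuming a llama_index version that still ships ServiceContext and set_global_service_context; the Vertex model name and embedding model name are illustrative placeholders, not taken from the thread.

```python
# Sketch: register a non-OpenAI service context as the global default,
# so load_index_from_storage no longer falls back to OpenAI.
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import Vertex
from llama_index.embeddings import HuggingFaceEmbedding

# Hypothetical model choices for illustration only.
llm = Vertex(model="text-bison")
embed_model = HuggingFaceEmbedding(
    model_name="sentence-transformers/all-MiniLM-L6-v2"
)

service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model=embed_model,
)

# From here on, any call that would build a default service context
# (including load_index_from_storage) picks up this one instead.
set_global_service_context(service_context)
```

With the global default set, the original two-line load snippet works unchanged.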
You are correct. After changing the code to index = load_index_from_storage(storage_context=storage_context, service_context=service_context), it works now. Thanks.
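For completeness, the working load path might look like the sketch below. It assumes service_context is the same Vertex-plus-HF-embedding context used when the index was built and persisted; the persist_dir is the one from the thread.

```python
# Sketch: reload a persisted index without triggering the OpenAI default.
from llama_index import StorageContext, load_index_from_storage

# Rebuild the storage context from the persisted files.
storage_context = StorageContext.from_defaults(
    persist_dir='/content/drive/MyDrive/data/vectordb'
)

# Passing service_context explicitly stops the loader from
# constructing an OpenAI-backed default service context.
index = load_index_from_storage(
    storage_context=storage_context,
    service_context=service_context,
)

query_engine = index.as_query_engine()
```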