

OpenAI key

Hi, I am trying to preload an index of data from storage, and I also want the user to be able to specify their own OpenAI API key. But it looks like the OpenAI key is needed before llama_index can successfully load a saved index, so I can't load the index until the user specifies their key, and loading the index takes about a minute. It would be a much better user experience if the index loaded from storage without needing the OpenAI key present; as it stands, if I want the user to be able to use their own key, I have to make them sit and wait a full minute for the index to load. I understand why the API key is needed for doing the indexing, but I don't understand why it would be needed to load a saved index from storage. Is there anything that could be done about this? Thanks!
The only reason it's needed is that loading the index also initializes the llm_predictor and embed_model under the hood.

Technically, you could load the index with a fake key and it would work fine... I think lol, it's just a little hacky.

Plain Text
import os

from langchain.chat_models import ChatOpenAI
from llama_index import LLMPredictor, load_index_from_storage
from llama_index.embeddings.openai import OpenAIEmbedding

os.environ["OPENAI_API_KEY"] = "FAKE"

index = load_index_from_storage(...)

# re-initialize the models with the proper key
index.service_context.llm_predictor = LLMPredictor(llm=ChatOpenAI(..., openai_api_key="real"))
index.service_context.embed_model = OpenAIEmbedding(openai_api_key="real")
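For the original use case (preloading the index at startup and letting the user plug in their own key later), the same trick can be wrapped in a couple of small helpers. This is just a sketch along the lines of the snippet above, not something tested end to end: INDEX_DIR, preload_index, and set_user_key are made-up names, and it assumes the legacy ServiceContext-based API where the predictor and embed model can be reassigned on the loaded index.

Plain Text
import os

from langchain.chat_models import ChatOpenAI
from llama_index import LLMPredictor, StorageContext, load_index_from_storage
from llama_index.embeddings.openai import OpenAIEmbedding

INDEX_DIR = "./storage"  # hypothetical path to the persisted index


def preload_index():
    # Placeholder key so loading from storage doesn't fail at startup
    os.environ["OPENAI_API_KEY"] = "FAKE"
    storage_context = StorageContext.from_defaults(persist_dir=INDEX_DIR)
    return load_index_from_storage(storage_context)


def set_user_key(index, user_key):
    # Swap the placeholder models for ones built with the user's real key
    index.service_context.llm_predictor = LLMPredictor(
        llm=ChatOpenAI(temperature=0, openai_api_key=user_key)
    )
    index.service_context.embed_model = OpenAIEmbedding(openai_api_key=user_key)

That way the slow load_index_from_storage call happens once at startup with the placeholder key, and set_user_key is cheap to call whenever the user actually provides theirs.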