How can I load a persisted index when I'm using a local LLM? I am using this code:
storage_context = StorageContext.from_defaults(persist_dir="../data/datastore")
index = load_index_from_storage(storage_context, llm=llm)

where llm is an instance of LlamaCPP. I get this error:

Could not load OpenAI model. If you intended to use OpenAI, please check your OPENAI_API_KEY.