The community member is trying to load an index they created with a local model (zephyr) but is encountering an error asking for an OpenAI API key. They saw in the documentation that they need to pass a custom service_context during the load, but are unsure how to do so. Other community members suggest either setting a global service context or passing service_context as a keyword argument to the load_index_from_storage function. The original community member confirms that the issue was caused by omitting service_context= when calling the function, and the problem is now resolved.
Hi guys, I'm currently trying to load an index I created using a local model (zephyr), but when using load_index_from_storage I get an error asking for the OpenAI API key. I saw in the storing documentation (https://docs.llamaindex.ai/en/stable/understanding/storing/storing.html#persisting-to-disk) that I need to pass my custom service_context during the load, but I'm unsure how to do it (it doesn't appear as a named argument). Any help would be appreciated, thanks!
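The resolution described in the thread can be sketched as follows. This is a minimal illustration using the legacy llama_index ServiceContext API; it assumes a local LLM wrapper (here a hypothetical `local_llm` for zephyr) and a local embedding model are already set up, and that the index was previously persisted to `./storage`:

```python
from llama_index import (
    ServiceContext,
    StorageContext,
    load_index_from_storage,
)

# Rebuild the same service context used when the index was created,
# so no OpenAI key is required at load time. `local_llm` is a
# placeholder for your zephyr model wrapper (assumed, not from the thread).
service_context = ServiceContext.from_defaults(
    llm=local_llm,
    embed_model="local",  # use a local embedding model instead of OpenAI
)

# Point at the directory the index was persisted to.
storage_context = StorageContext.from_defaults(persist_dir="./storage")

# Passing service_context= here is what resolves the OpenAI key error;
# load_index_from_storage accepts it via **kwargs even though it is not
# a named parameter in the signature.
index = load_index_from_storage(
    storage_context,
    service_context=service_context,
)
```

Alternatively, as suggested in the thread, the service context can be registered globally (via `set_global_service_context(service_context)`) so it no longer needs to be passed to each call.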