Updated 4 months ago

At a glance

A community member is trying to load a persisted index that was built with a local model (zephyr), but gets an error asking for an OpenAI API key. The storing documentation says to pass a custom service_context during the load, but they are unsure how to do so. Other community members suggest either setting a global service context or passing service_context as a keyword argument to load_index_from_storage. The original poster confirms the error was caused by omitting service_context= in the call, and the problem is now resolved.

Hi guys,
I'm currently trying to load an index I created using a local model (zephyr), but when calling load_index_from_storage I get an error asking for an OpenAI API key. I saw in the storing documentation (https://docs.llamaindex.ai/en/stable/understanding/storing/storing.html#persisting-to-disk) that I need to pass my custom service_context during the load, but I'm unsure how to do it (it doesn't seem to be allowed as an argument). Any help would be appreciated, thanks!
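For readers hitting the same wall, here is a minimal sketch of the load path. It assumes the legacy (pre-0.10) llama_index ServiceContext API, and `local_llm` is a placeholder for however the zephyr model is wrapped (e.g. a HuggingFaceLLM instance), so this is illustrative rather than runnable as-is:

```python
# Sketch only: pre-0.10 llama_index API. `local_llm` is a placeholder
# for your local zephyr wrapper (e.g. a HuggingFaceLLM instance).
from llama_index import ServiceContext, StorageContext, load_index_from_storage

# Service context pointing at the local model instead of the OpenAI default.
service_context = ServiceContext.from_defaults(
    llm=local_llm,
    embed_model="local",  # local HuggingFace embeddings instead of OpenAI
)

# Rebuild the storage context from the persisted directory, then pass the
# service context explicitly; otherwise llama_index falls back to the
# OpenAI defaults and asks for an API key.
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context, service_context=service_context)
```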
Attachment: image.png
3 comments
Try passing service_context as a kwarg:
index = load_index_from_storage(storage_context, service_context=service_context)
Thanks both @WhiteFang_Jr @Emanuel Ferreira! I was passing the service context without the service_context= keyword, which caused the error. Now fixed 👍
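The global-service-context route mentioned in the summary avoids threading the kwarg through every call. A sketch under the same legacy-API assumption (`local_llm` is again a placeholder for the local zephyr wrapper, so this is illustrative rather than runnable as-is):

```python
# Sketch only: pre-0.10 llama_index API with a global service context.
from llama_index import (
    ServiceContext,
    StorageContext,
    load_index_from_storage,
    set_global_service_context,
)

service_context = ServiceContext.from_defaults(llm=local_llm, embed_model="local")

# Register once; subsequent loads and queries pick it up by default,
# so load_index_from_storage no longer needs the kwarg.
set_global_service_context(service_context)

storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)
```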