Hi guys,
I'm currently trying to load an index I created using a local model (zephyr); however, when using load_index_from_storage I get an error asking for an OpenAI API key. I saw in the storing documentation (https://docs.llamaindex.ai/en/stable/understanding/storing/storing.html#persisting-to-disk) that I need to pass my custom service_context during the load, but I'm unsure how to do it (it doesn't seem to be accepted as an argument). Any help would be appreciated, thanks!
[Attachment: image.png]
Try passing service_context as a kwarg:
index = load_index_from_storage(storage_context, service_context=service_context)
Thanks both @WhiteFang_Jr @Emanuel Ferreira! I was passing the service context without the service_context= keyword, which brought up the error. Now fixed 👍
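For reference, a loading sketch under the same assumptions, reusing the service_context built above: the key point is that service_context goes in as a keyword argument, otherwise load_index_from_storage falls back to the default (OpenAI-backed) service context and asks for an API key.

```python
from llama_index import StorageContext, load_index_from_storage

# Point at the directory the index was persisted to.
storage_context = StorageContext.from_defaults(persist_dir="./storage")

# service_context must be passed by keyword, not positionally.
index = load_index_from_storage(
    storage_context,
    service_context=service_context,
)
query_engine = index.as_query_engine()
```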