Storage
Hello, I have a question about StorageContext. Is there a way to use a local StorageContext instead of the default one, which uses an OpenAI key?
Yes, define a service context with the llm and embed model and set it as global.

It'll then use the models defined in the service context for your storage operations.
Plain Text
from llama_index import ServiceContext, set_global_service_context

# No LLM, and a local embedding model instead of OpenAI
service_context = ServiceContext.from_defaults(
    llm=None,
    embed_model='local'
)

set_global_service_context(service_context)

# do storage operations from here
You then don't need to pass an llm anywhere.
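For example, here is a minimal sketch of a storage operation that would then run fully locally; the ./storage persist directory is an assumption for illustration:
Plain Text
from llama_index import StorageContext, load_index_from_storage

# Rebuild the storage context from a previously persisted index
# (the "./storage" path is illustrative)
storage_context = StorageContext.from_defaults(persist_dir="./storage")

# Loads the index using the local embed model from the global
# service context, so no OpenAI key is needed
index = load_index_from_storage(storage_context)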
Thanks WhiteFang, it works!
One additional question: is there a way to improve the speed of the program when I use the global service context?