

ServiceContext

Hi folks! I saved a pretty meaty index via index.storage_context.persist(), and I'm loading it with load_index_from_storage. Is there a way to change the ServiceContext on that? I'd like to use a different LLM after indexing. Thanks!
4 comments
Yup! You can pass in the service context when loading:

load_index_from_storage(..., service_context=service_context)

Note that you should only change the LLM, not the embed model.
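
For example, here's a minimal sketch of the loading step, assuming the index was persisted to a "./storage" directory (the directory name is just an illustration) and the legacy ServiceContext API:

from llama_index import StorageContext, load_index_from_storage, ServiceContext
from llama_index.llms import OpenAI

# Rebuild the storage context from the persisted directory
storage_context = StorageContext.from_defaults(persist_dir="./storage")

# New service context with a different LLM; keep the original embed model
service_context = ServiceContext.from_defaults(llm=OpenAI(model="gpt-4"))

# The loaded index will use the new service context for queries
index = load_index_from_storage(storage_context, service_context=service_context)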
Thanks! I tried that, but maybe my issue is something else then -- basically, when I have DEBUG on, I can see that the system prompt is wrong (not the one I put into the ServiceContext). Am I doing something silly?

service_context = ServiceContext.from_defaults(llm=OpenAI(model="gpt-4", temperature=0.5, system_prompt="My prompt"))

Thanks!
I think the system prompt kwarg goes into the service context, not the LLM constructor πŸ‘€
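
For reference, a sketch of the corrected construction under the same legacy API, with the system prompt on the service context rather than the LLM:

# system_prompt is a ServiceContext.from_defaults kwarg, not an OpenAI() kwarg
service_context = ServiceContext.from_defaults(
    llm=OpenAI(model="gpt-4", temperature=0.5),
    system_prompt="My prompt",
)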
Ohhh, you're absolutely right, thanks!!