ServiceContext requires two models: one for response generation (the LLM) and another for creating embeddings.
You have only provided the llm part, so it falls back to OpenAI for the embedding model.
Since it is downloading the embed model on its own, I suspect you are using an old version.
Anyway, you can fix it by explicitly providing the embed model:
service_context = ServiceContext.from_defaults(llm=llm, embed_model='local')
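For context, here is a minimal sketch of how that service context would typically be wired into an index. It assumes llm is already defined as in your code, and the "data" directory is just a hypothetical documents folder:

    from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex

    # 'local' tells LlamaIndex to use a local HuggingFace embedding model
    # instead of defaulting to OpenAI embeddings.
    service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

    # Load documents and build the index with the custom service context.
    documents = SimpleDirectoryReader("data").load_data()  # hypothetical path
    index = VectorStoreIndex.from_documents(documents, service_context=service_context)

With embed_model="local", embeddings are computed on your machine and no OpenAI key is needed for that step.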