Is set_global_service_context still required? When I tried swapping to a local LLM, the call that passed a service context to the vector store didn't work (it wanted Settings.embed_model instead), but now, after swapping back to AzureOpenAI, it requires the service context again. I'm not sure I understand why there is a difference.
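For reference, this is roughly the 0.10 Settings-based pattern I expected to work without a service context (a minimal sketch, assuming the llama-index-llms-azure-openai and llama-index-embeddings-azure-openai integration packages are installed; the endpoint, deployments, and keys are placeholders):

```python
from llama_index.core import Settings, VectorStoreIndex
from llama_index.core.vector_stores import SimpleVectorStore
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding

# Configure global defaults once; in 0.10 this replaces
# ServiceContext / set_global_service_context.
Settings.llm = AzureOpenAI(
    model="gpt-4",                          # placeholder model
    deployment_name="my-gpt4-deployment",   # placeholder deployment
    api_key="...",                          # placeholder key
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_version="2023-07-01-preview",
)
Settings.embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",
    deployment_name="my-embedding-deployment",  # placeholder deployment
    api_key="...",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_version="2023-07-01-preview",
)

# Build the index without passing a service_context; it should pick up
# Settings.embed_model. SimpleVectorStore stands in for the real store here.
vector_store = SimpleVectorStore()
index = VectorStoreIndex.from_vector_store(vector_store)
```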
I can't give the complete example now because I uninstalled the entire llama-index environment, reinstalled it, and it just worked... but basically, as you said, I upgraded from 0.9 to 0.10 and had to change all of the deprecated stuff, and then I had: