Just a quick question: if we create a ServiceContext via ServiceContext.from_defaults() to pass in a different sentence-transformers model, will VectorStoreIndex.from_documents() create the embeddings using that model when we pass in the documents and that ServiceContext object? Specifically when using ChromaDB.
Yes. If you want to use an embed model other than the default OpenAI one, you need to pass it in the service context, and all embeddings will then be created with the newly defined model.
I couldn't tell whether LlamaIndex was using the embed model passed into the service_context, or ChromaDB's default embedding function from the storage_context (since the ChromaVectorStore was built without passing in an embedding model).
You could try setting your service context as the global one. That way, if the storage context checks for a service context, it will pick up your globally defined one.
Beyond that, I think it would be best to check the source code to verify whether it uses the defined model or its own.