I have a local model working using LlamaCPP. A couple of questions: I'm assuming I should rebuild my index with the new LLM? The documentation mentions installing sentence-transformers if you want to use local embeddings. If I switch back to OpenAI later, will it still use local embeddings?
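For reference, here is a rough sketch of what I think my setup looks like (assuming LlamaIndex's pre-1.0 ServiceContext API; the model path and data folder are just placeholders):

```python
# Minimal sketch (assumption: LlamaIndex ~0.9 with ServiceContext).
# The LLM and the embedding model are configured separately, so swapping
# the LLM to LlamaCPP doesn't by itself change which embeddings are used.
from llama_index import ServiceContext, VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms import LlamaCPP

# Hypothetical local model path.
llm = LlamaCPP(model_path="./models/llama-2-7b-chat.Q4_K_M.gguf")

# embed_model="local" uses sentence-transformers to embed on this machine;
# leaving it unset falls back to OpenAI embeddings (needs an API key).
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
```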