
I have a local model working using LlamaCPP. Some questions: I'm assuming I should rebuild my index with the new LLM? The documentation mentions that if you want to use local embeddings, you should install sentence-transformers. If I switch back to OpenAI later, will it use local embeddings as well?
You only have to rebuild if you switch embedding models.

The same embedding model should be used to build the index and to query it.

The LLM doesn't matter, though; it can differ between the two steps.
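To illustrate why the build-time and query-time embedding models must match, here is a minimal toy sketch (this is not the LlamaIndex API; `embed_a`, `embed_b`, and `ToyIndex` are made-up names for illustration). Two different embedding models produce vectors in incompatible spaces, so a query embedded with a different model than the one that built the index gives meaningless distances:

```python
# Toy illustration (not the real LlamaIndex API): the embedding model used
# to build the index must also be used at query time, while the LLM can
# change freely between the two steps.

def embed_a(text):
    # Stand-in for one embedding model (e.g. a local sentence-transformers model).
    return [float(len(text)), float(text.count(" "))]

def embed_b(text):
    # Stand-in for a different embedding model; note it even has a
    # different dimensionality, so its vectors are not comparable.
    return [float(sum(map(ord, text)))]

class ToyIndex:
    def __init__(self, docs, embed_fn):
        # Remember which embedding model built this index.
        self.embed_fn = embed_fn
        self.vectors = [(doc, embed_fn(doc)) for doc in docs]

    def query(self, text, embed_fn):
        if embed_fn is not self.embed_fn:
            # Mismatched models -> distances are meaningless; rebuild instead.
            raise ValueError("query embedding model differs from build-time model")
        qv = embed_fn(text)
        # Nearest neighbour by squared Euclidean distance.
        return min(
            self.vectors,
            key=lambda dv: sum((a - b) ** 2 for a, b in zip(qv, dv[1])),
        )[0]

index = ToyIndex(["hello world", "hi"], embed_a)
print(index.query("hello", embed_a))   # same model for build + query: works
```

Querying the same index with `embed_b` raises an error, which is the toy analogue of "rebuild your index if you switch embedding models."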
Thanks for the response, @Logan M !