I want to build a RAG pipeline using LlamaIndex and Llama 3.1, but I don't want to install Ollama and have it download the model every time. Is there a way to download Llama 3.1 once and then load it locally for use with the LlamaIndex framework?
Does anyone know how to use Llama 3 as the LLM in LlamaIndex without Ollama? Can we use Hugging Face instead? I'd rather avoid Ollama because I don't know where it saves the model or how to change the save directory.
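
Roughly, this is the kind of setup I'm aiming for: a minimal sketch assuming the `llama-index-llms-huggingface` and `llama-index-embeddings-huggingface` integration packages are installed, that the Llama 3.1 weights are already downloaded locally (or accessible via the gated `meta-llama/Meta-Llama-3.1-8B-Instruct` repo), and that `./data` holds my documents. The model id, data path, and embedding model here are placeholders, not something I've confirmed.

```python
# Sketch: run Llama 3.1 through Hugging Face inside LlamaIndex instead of Ollama.
# Assumes weights are already in the local Hugging Face cache or a local directory.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.huggingface import HuggingFaceLLM
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# A local path to the downloaded weights could presumably be passed here
# instead of the hub repo id (assumption on my part).
llm = HuggingFaceLLM(
    model_name="meta-llama/Meta-Llama-3.1-8B-Instruct",
    tokenizer_name="meta-llama/Meta-Llama-3.1-8B-Instruct",
    context_window=8192,
    max_new_tokens=512,
    device_map="auto",
)

# Use the Hugging Face LLM and a local embedding model for the whole pipeline.
Settings.llm = llm
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Build a simple RAG index over local documents and query it.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
print(query_engine.query("What do these documents say about X?"))
```

Is this the right approach, and how do I control where the model weights get stored on disk?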