I would like to use a model from a local directory, Mistral for instance. According to this source:
https://gpt-index.readthedocs.io/en/latest/examples/llm/llama_2_llama_cpp.html I can load the model from a URL or from a path. LlamaCPP downloads the model to /tmp/llama_index/models/. If I pass a model path, will it be picked up first so the model doesn't have to be downloaded? Secondly, how do I change the download location from /tmp to a directory of my choice?
from llama_index.llms import LlamaCPP

llm = LlamaCPP(
    # You can pass in the URL to a GGML model to download it automatically
    model_url=model_url,
    # optionally, you can set the path to a pre-downloaded model instead of model_url
    model_path=None,
)
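In other words, here is roughly what I'm hoping works: a minimal sketch assuming a Mistral GGUF file already sits on disk (the file path and generation parameters below are just placeholders for my setup):

# A sketch of the local-path variant, assuming a pre-downloaded model file.
# The path below is hypothetical; substitute your own location.
from llama_index.llms import LlamaCPP

llm = LlamaCPP(
    # skip model_url entirely and point directly at the local file
    model_path="/home/me/models/mistral-7b-instruct-v0.1.Q4_K_M.gguf",
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
    verbose=True,
)

Does LlamaCPP short-circuit the download when model_path is set like this, and is there a setting that redirects the default /tmp/llama_index/models/ cache somewhere else?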