
model path

I would like to use a model from a local directory, Mistral for instance. According to this source: https://gpt-index.readthedocs.io/en/latest/examples/llm/llama_2_llama_cpp.html I can load it from a URL or a path. LlamaCPP downloads the model into /tmp/llama_index/models/. If I pass a model path, will it be recognized first so the model doesn't have to be downloaded? Secondly, how do I change the download location from /tmp to a directory of my choice?

llm = LlamaCPP(
    # You can pass in the URL to a GGML model to download it automatically
    model_url=model_url,
    # optionally, you can set the path to a pre-downloaded model instead of model_url
    model_path=None,
)
You can pass a model_path argument to LlamaCPP instead of model_url. That solves both problems: the model will not be downloaded, and it will be loaded from the path you give it.
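For example, a minimal sketch assuming the Mistral weights are already on disk (the directory and filename below are placeholders, not your actual path):

from llama_index.llms import LlamaCPP

llm = LlamaCPP(
    # When model_path is set, nothing is downloaded; the file is loaded from disk.
    # Hypothetical path for illustration; point this at your own model file.
    model_path="/home/me/models/mistral-7b-instruct-v0.1.Q4_K_M.gguf",
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
)

response = llm.complete("Hello!")
print(response.text)

Since model_path bypasses the download step entirely, there is no /tmp/llama_index/models/ location to redirect; you control where the file lives.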