
Has anyone tried to load the LLaMA 2 model via HuggingFaceLLM instead of Replicate? I'm also trying to do this on Azure Databricks.
Instead of manually downloading the checkpoints, can I directly pull them with HuggingFaceLLM, like `model = 'meta-llama/Llama-2-13b-chat-hf'`?

Like here:
https://thealgorithmicminds.com/how-to-use-huggingface-to-use-llama-2-on-your-custom-machine-35713a2964de
As long as you have access to the gated repo on Hugging Face, yes, I think so.

You can create the model and tokenizer as in that example and pass them in as kwargs to the HuggingFace LLM. The example in the link above will download them automatically.
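A minimal sketch of what that could look like. Assumptions: the `llama-index` package is installed (the import path has moved between versions), your Hugging Face account has been granted access to the gated `meta-llama` repo, and you are logged in (e.g. via `huggingface-cli login`) on the Databricks cluster:

```python
# Sketch: loading Llama 2 through LlamaIndex's HuggingFaceLLM instead of Replicate.
# In older llama-index releases the import was `from llama_index.llms import HuggingFaceLLM`.
from llama_index.llms.huggingface import HuggingFaceLLM

# Passing model_name/tokenizer_name makes HuggingFaceLLM download the
# checkpoint from the Hub for you, so no manual checkpoint handling is needed.
llm = HuggingFaceLLM(
    model_name="meta-llama/Llama-2-13b-chat-hf",
    tokenizer_name="meta-llama/Llama-2-13b-chat-hf",
    context_window=4096,
    max_new_tokens=256,
    device_map="auto",  # spread layers across the GPUs available on the cluster
)
```

Alternatively, if you build the model and tokenizer yourself with `transformers` (as in the linked article), you can hand the objects in directly via the `model=` and `tokenizer=` kwargs instead of the `*_name` variants.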