
Hi all, when using HuggingFaceLLM to call Llama3, is there a way to download the model locally, or somewhere in AWS?
3 comments
it does download the model locally already 👀
oh cool. Thank you. Is it possible to download the model into an AWS S3 bucket?
I think not, you'd have to put the model on S3 yourself, download it, and then point the HuggingFaceLLM to the downloaded folder 👀
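For reference, here's a minimal sketch of that workflow, not taken from the thread: copy the model files down from an S3 prefix with boto3, then point HuggingFaceLLM at the local folder. The bucket name, key prefix, and local path are placeholders, and the import path assumes a recent llama-index-llms-huggingface package.

```python
import os
import boto3
from llama_index.llms.huggingface import HuggingFaceLLM

BUCKET = "my-model-bucket"               # hypothetical bucket name
PREFIX = "models/llama-3-8b-instruct/"   # hypothetical key prefix
LOCAL_DIR = "/tmp/llama-3-8b-instruct"

# Copy every object under the prefix to LOCAL_DIR (the model's config,
# tokenizer, and weight files that were uploaded to S3 beforehand).
s3 = boto3.client("s3")
pages = s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET, Prefix=PREFIX)
for page in pages:
    for obj in page.get("Contents", []):
        rel_path = obj["Key"][len(PREFIX):]
        if not rel_path or rel_path.endswith("/"):
            continue  # skip "folder" placeholder objects
        dest = os.path.join(LOCAL_DIR, rel_path)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        s3.download_file(BUCKET, obj["Key"], dest)

# HuggingFaceLLM forwards model_name/tokenizer_name to transformers'
# from_pretrained, which accepts a local directory as well as a hub id.
llm = HuggingFaceLLM(model_name=LOCAL_DIR, tokenizer_name=LOCAL_DIR)
```

The same local folder can be reused on later runs, so the S3 download only needs to happen once per machine.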