JefeDelTodos
I've run into an error and haven't been able to figure it out. I'm following this notebook to set up Llama 2 via Hugging Face: https://colab.research.google.com/drive/14N-hmJ87wZsFqHktrw40OU6sVcsiSzlQ?usp=sharing#scrollTo=lMNaHDzPM68f. When I get to this block of code:
Python
# Imports as used in the notebook's llama-index version (0.9.x era);
# newer releases moved these to llama_index.core / llama_index.llms.huggingface.
from llama_index.llms import HuggingFaceLLM
from llama_index.prompts import PromptTemplate

# hf_token and quantization_config are defined in earlier cells of the notebook
llm = HuggingFaceLLM(
    model_name="meta-llama/Llama-2-7b-chat-hf",
    tokenizer_name="meta-llama/Llama-2-7b-chat-hf",
    # Wrap each query in Llama 2's [INST] chat format
    query_wrapper_prompt=PromptTemplate("<s> [INST] {query_str} [/INST] "),
    context_window=3900,
    model_kwargs={"token": hf_token, "quantization_config": quantization_config},
    tokenizer_kwargs={"token": hf_token},
    device_map="auto",
)
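
For context, hf_token and quantization_config come from earlier cells in the notebook. Roughly, those cells look like this (a minimal sketch; the exact values are my assumptions, not copied from the notebook):

Python
# Sketch of the earlier notebook cells the snippet above depends on.
import torch
from transformers import BitsAndBytesConfig

hf_token = "hf_..."  # placeholder: your Hugging Face access token

# 4-bit quantization so the 7B chat model fits in a single GPU's memory
quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
)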

I get the following error:

validation error for HuggingFaceLLM
system_prompt
  none is not an allowed value (type=type_error.none.not_allowed)

I haven't been able to figure it out. (I'm a complete newbie here, and this is my first time going through the LlamaIndex documentation.) Has anybody run into this before? I'm running on Windows 11 under WSL (Ubuntu).
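
From what I can tell, the error means HuggingFaceLLM's system_prompt field is typed as a plain str, so the None it receives by default fails pydantic validation. One workaround I've seen suggested is to pass an explicit system_prompt (even an empty string) so the field never sees None; this is a sketch of that idea, not a confirmed fix, and the prompt text is just a placeholder:

Python
# Workaround sketch: supply system_prompt explicitly so pydantic never
# receives None for it. "" also satisfies the str type if you want no prompt.
llm = HuggingFaceLLM(
    model_name="meta-llama/Llama-2-7b-chat-hf",
    tokenizer_name="meta-llama/Llama-2-7b-chat-hf",
    system_prompt="You are a helpful assistant.",  # placeholder text
    query_wrapper_prompt=PromptTemplate("<s> [INST] {query_str} [/INST] "),
    context_window=3900,
    model_kwargs={"token": hf_token, "quantization_config": quantization_config},
    tokenizer_kwargs={"token": hf_token},
    device_map="auto",
)

Upgrading llama-index (pip install -U llama-index) may also resolve it, if the default for system_prompt was fixed in a later release.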