Default Parameters for Llama2

At a glance

The post asks about the default parameter values when using the llama2 language model, specifically temperature and context window. A community member responds that the default temperature is 0 and the default context window is 4096, at least for the Huggingface implementation. The community member also shows how to configure the llama2 model with custom parameter values, such as setting the temperature to 0.0 and disabling sampling, and notes that llama2 can be run on other platforms with default or custom parameters.

What are the default values of parameters, such as temperature and context window, when using the llama2 LLM?
1 comment
I think the default value for temperature is 0 and for context_window it is 4096, at least in the case of Huggingface.
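For context, a temperature of 0 together with do_sample=False amounts to greedy decoding: the highest-scoring token is always chosen. A minimal, illustrative sketch of the idea (not Huggingface's actual implementation):

```python
import math

def softmax(logits, temperature=1.0):
    """Scale logits by temperature, then normalise into probabilities.
    Lower temperature sharpens the distribution; as temperature -> 0
    the mass concentrates on the argmax (greedy decoding)."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def greedy_pick(logits):
    """do_sample=False behaviour: always take the highest-scoring token."""
    return max(range(len(logits)), key=lambda i: logits[i])

logits = [2.0, 1.0, 0.5]
print(greedy_pick(logits))               # index of the top logit
print(softmax(logits, temperature=0.1))  # near one-hot at low temperature
```

At temperature 0.1 almost all probability sits on the first token, which is why temperature 0.0 plus do_sample=False gives deterministic output.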

You can change the parameters to suit your needs simply by passing the values when you initialise the llm object.
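As an illustration of that pattern, the hypothetical helper below (not part of llama_index; the default values are the ones assumed above) merges defaults with whatever you pass at initialisation time:

```python
# Assumed defaults for illustration; the real defaults live inside the llm class.
DEFAULTS = {"temperature": 0.0, "context_window": 4096}

def build_config(**overrides):
    """Return the effective configuration: defaults overridden by caller values."""
    return {**DEFAULTS, **overrides}

print(build_config())                 # {'temperature': 0.0, 'context_window': 4096}
print(build_config(temperature=0.7))  # temperature overridden, context_window kept
```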
When running Llama locally via HF, you can do the following:
Python
import torch
from llama_index.llms import HuggingFaceLLM

# query_wrapper_prompt and selected_model are defined earlier in the linked example
llm = HuggingFaceLLM(
    context_window=4096,
    max_new_tokens=2048,
    generate_kwargs={"temperature": 0.0, "do_sample": False},
    query_wrapper_prompt=query_wrapper_prompt,
    tokenizer_name=selected_model,
    model_name=selected_model,
    device_map="auto",
    # change these settings below depending on your GPU
    model_kwargs={"torch_dtype": torch.float16, "load_in_8bit": True},
)

Find more here: https://gpt-index.readthedocs.io/en/latest/examples/vector_stores/SimpleIndexDemoLlama-Local.html

You can also run Llama on other platforms, either with default parameters or with parameters of your choosing:
https://gpt-index.readthedocs.io/en/latest/examples/llm/llama_2.html#configure-model
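For example, the linked page configures Llama 2 through Replicate. A sketch along those lines (the model version hash is a placeholder, and a valid REPLICATE_API_TOKEN is required, so this is a configuration fragment rather than a runnable demo):

```python
import os
from llama_index.llms import Replicate

os.environ["REPLICATE_API_TOKEN"] = "<your token>"  # required by Replicate

llm = Replicate(
    model="a16z-infra/llama13b-v2-chat:<version-hash>",  # placeholder version hash
    temperature=0.0,  # same deterministic setting as the HF example above
    additional_kwargs={"max_new_tokens": 300},
)
```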