I found in the code BUILDER_LLM = OpenAI(model="gpt-4-1106-preview") for OpenAI, but not for Hugging Face.
Yes, you'll have to make changes in this file to replace the LLM with a Hugging Face LLM.
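If it helps, here is a rough sketch of what swapping in a Hugging Face model could look like with LlamaIndex's HuggingFaceLLM wrapper. The import path, the model name, and the parameter values below are assumptions for illustration (for a recent LlamaIndex version), not taken from the original code:

```python
# a minimal sketch: replacing the OpenAI builder LLM with a Hugging Face one
# (import path assumes a recent LlamaIndex version with the
#  llama-index-llms-huggingface integration package installed)
from llama_index.llms.huggingface import HuggingFaceLLM

BUILDER_LLM = HuggingFaceLLM(
    # any causal LM from the Hugging Face Hub; this model name is just an example
    model_name="HuggingFaceH4/zephyr-7b-beta",
    tokenizer_name="HuggingFaceH4/zephyr-7b-beta",
    # keep the context window below the model's maximum for some wiggle room
    context_window=3900,
    max_new_tokens=256,
    generate_kwargs={"temperature": 0.1, "do_sample": True},
    # "auto" places model layers on the GPU if one is available
    device_map="auto",
)
```

Constructing this object downloads the model weights on first use, so it needs a machine with enough memory for the chosen model.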
Can you help me with an example?
Thank you very much, whitefang.
Please, I have my own llama.cpp URL and I want to use it instead of gpt-4-1106-preview, which I found in the code.
You can take a look at how LlamaCPP is set up:

Python
# import paths shown for a recent LlamaIndex version; older releases
# used from llama_index.llms import LlamaCPP instead
from llama_index.llms.llama_cpp import LlamaCPP
from llama_index.llms.llama_cpp.llama_utils import (
    messages_to_prompt,
    completion_to_prompt,
)

BUILDER_LLM = LlamaCPP(
    # You can pass in the URL to a GGML model to download it automatically
    model_url=model_url,
    # optionally, you can set the path to a pre-downloaded model instead of model_url
    model_path=None,
    temperature=0.1,
    max_new_tokens=256,
    # llama2 has a context window of 4096 tokens, but we set it lower to allow for some wiggle room
    context_window=3900,
    # kwargs to pass to __call__()
    generate_kwargs={},
    # kwargs to pass to __init__()
    # set to at least 1 to use GPU
    model_kwargs={"n_gpu_layers": 1},
    # transform inputs into Llama2 format
    messages_to_prompt=messages_to_prompt,
    completion_to_prompt=completion_to_prompt,
    verbose=True,
)
I think this should work
ModuleNotFoundError: No module named 'llama_module'
I don't understand.
But thank you very much, the approach seems good.
But I have installed it.
NameError: name 'LlamaCPP' is not defined
You need to follow the link shared above to set up LlamaCPP properly.
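For reference, the LlamaCPP wrapper needs the llama-cpp-python bindings installed first. Something along these lines should cover it, though the exact package split depends on your LlamaIndex version, so treat the second command as an assumption:

```shell
# install the underlying llama.cpp Python bindings
pip install llama-cpp-python

# for recent LlamaIndex versions (0.10+), the LlamaCPP wrapper
# lives in its own integration package
pip install llama-index-llms-llama-cpp
```

If the NameError persists after installing, double-check that the import statement matches your installed LlamaIndex version.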