I have a weird question. So I didn't set up my API key correctly

I have a weird question. So I didn't set up my API key correctly and it defaulted to using llama-13b. I was wondering if it's possible to make it so that it always uses llama-13b (the one I downloaded)?
Yea, you can set up the LLM like in this example:

https://gpt-index.readthedocs.io/en/stable/examples/llm/llama_2_llama_cpp.html#setup-llm

Then just pass it into the service context:

service_context = ServiceContext.from_defaults(llm=llm, ...)
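Putting that together, roughly like this (a minimal sketch based on that docs page; the model path and parameter values are placeholders to swap for your own):

from llama_index import ServiceContext
from llama_index.llms import LlamaCPP
from llama_index.llms.llama_utils import messages_to_prompt, completion_to_prompt

# Point model_path at the weights you already downloaded (placeholder path)
llm = LlamaCPP(
    model_path="/path/to/llama-2-13b-chat.gguf",
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
    # Llama 2 expects a specific prompt format; these helpers handle it
    messages_to_prompt=messages_to_prompt,
    completion_to_prompt=completion_to_prompt,
    verbose=True,
)

service_context = ServiceContext.from_defaults(llm=llm)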
I see, thank you! One other quick question: I've figured out how to instantiate the LLM object, but I couldn't find documentation regarding this:

messages_to_prompt=messages_to_prompt,
completion_to_prompt=completion_to_prompt,
Also, I keep trying to set a model_path but I keep getting this:

"SyntaxError: (unicode error) 'unicodeescape' codec can't decode bytes in position 2-3: truncated \UXXXXXXXX escape"
Those are helper functions to transform the prompts/chat objects into something the model expects. Llama 2 has very specific prompt-format requirements, so we provide helper functions for it:
https://github.com/jerryjliu/llama_index/blob/e779ad994220156576b72b014b89f280fa26bc7f/llama_index/llms/llama_utils.py#L16
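As a quick illustration of what they do (the output shown in the comment is approximate, not exact):

from llama_index.llms.llama_utils import completion_to_prompt

# Wraps a plain completion in Llama 2's expected chat format, roughly:
# <s> [INST] <<SYS>> ...default system prompt... <</SYS>> What is LlamaIndex? [/INST]
print(completion_to_prompt("What is LlamaIndex?"))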
What does your model path look like?
Ah, I figured that out. I just needed to use double backslashes because the path was triggering unicode escape errors
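For anyone else hitting that SyntaxError: in a normal Python string, \U starts a unicode escape, so a Windows path like "C:\Users\..." breaks. Any of these forms avoid it (the path itself is just a placeholder):

model_path = "C:\\Users\\me\\models\\llama-2-13b-chat.gguf"  # escaped backslashes
model_path = r"C:\Users\me\models\llama-2-13b-chat.gguf"     # raw string
model_path = "C:/Users/me/models/llama-2-13b-chat.gguf"      # forward slashes also work on Windows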
Ahh I see! Thanks Logan, you've saved my project. Question: if I wanted to just use the default OpenAI model (since llama is taking a while to load) but also wanted to include a custom default system prompt, what would I do?
Also, is there any way I can make it give longer output? The responses always feel short by default. Maybe I should play with the temperature?
service_context = ServiceContext.from_defaults(..., system_prompt="Talk like a pirate")
Yea, temperature may help. Or just asking it to be more thorough in its responses lol
Ahh, I can probably put that in the system prompt!
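Putting both suggestions together, something like this (an untested sketch; the model name, temperature, and max_tokens are just example values):

from llama_index import ServiceContext
from llama_index.llms import OpenAI

# Higher temperature for more varied output; raise max_tokens if answers feel cut short
llm = OpenAI(model="gpt-3.5-turbo", temperature=0.7, max_tokens=512)

service_context = ServiceContext.from_defaults(
    llm=llm,
    system_prompt="Talk like a pirate. Be thorough and detailed in your answers.",
)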