Updated 5 months ago

Hi Everyone, is there any example

At a glance

The community member is asking for an example of how to use the Mozilla/Meta-Llama-3.1-8B-Instruct-llamafile model with LlamaIndex. Other community members offer some guidance, pointing to the documentation on running a llamafile model and sharing sample code that uses the ChatMessage class. However, there is no explicitly marked answer, and the community members are still trying to get the model working with LlamaIndex.

Useful resources
Hi everyone, is there any example available on how to use Mozilla/Meta-Llama-3.1-8B-Instruct-llamafile (https://huggingface.co/Mozilla/Meta-Llama-3.1-8B-Instruct-llamafile), the llamafile Instruct model? I see there is some prompt guidance. How do I use it in LlamaIndex with llamafile? Is there any example available?
6 comments
so using this (the `llm` variable wasn't defined in the snippet; assuming the Llamafile integration from llama-index-llms-llamafile, with the llamafile server already running locally):

from llama_index.llms.llamafile import Llamafile
from llama_index.core.llms import ChatMessage

# Connects to a llamafile server running on its default local port
llm = Llamafile()

messages = [
    ChatMessage(
        role="system",
        content="Pretend you are a pirate with a colorful personality.",
    ),
    ChatMessage(role="user", content="What is your name?"),
]
resp = llm.chat(messages)
print(resp)

which should map onto the Llama 3.1 instruct prompt template:
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{{prompt}}<|eot_id|>{{history}}<|start_header_id|>{{char}}<|end_header_id|>

Because if I run the starter example at https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local/ with the instruct llamafile, I don't get any output.
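For reference, the template above can be assembled by hand from a list of chat messages. This is a minimal sketch of how the Llama 3.1 special tokens fit together; `render_llama3_prompt` is a hypothetical helper for illustration, not part of LlamaIndex or llamafile:

```python
# Hypothetical helper: renders (role, content) pairs into the Llama 3.1
# chat format shown above. Not a LlamaIndex API -- illustration only.
def render_llama3_prompt(messages):
    parts = ["<|begin_of_text|>"]
    for role, content in messages:
        # Each turn is wrapped in header tokens and terminated with <|eot_id|>
        parts.append(
            f"<|start_header_id|>{role}<|end_header_id|>\n\n{content}<|eot_id|>"
        )
    # Open an assistant header so the model generates the reply
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = render_llama3_prompt([
    ("system", "Pretend you are a pirate with a colorful personality."),
    ("user", "What is your name?"),
])
print(prompt)
```

When the chat integration works correctly, `llm.chat(messages)` should be applying a template like this for you, so you normally don't build the prompt yourself.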
It should work as shown here:
is your llm running?