So I am hoping that the snippet below serves the purpose for the prompt template shown after it:

```python
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system",
        content="Pretend you are a pirate with a colorful personality.",
    ),
    ChatMessage(role="user", content="What is your name?"),
]
resp = llm.chat(messages)
```
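Here `llm` is the llamafile-backed LLM. A minimal sketch of how I set that up, assuming the instruct llamafile is already running as a server on its default port and the `llama-index-llms-llamafile` integration is installed:

```python
# Sketch only: assumes the llamafile server is reachable at localhost:8080.
from llama_index.llms.llamafile import Llamafile

llm = Llamafile(base_url="http://localhost:8080")
resp = llm.chat(messages)      # `messages` as defined above
print(resp.message.content)    # the assistant's reply text
```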
The prompt template in question:

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{{prompt}}<|eot_id|>{{history}}<|start_header_id|>{{char}}<|end_header_id|>
```
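To make the mapping I have in mind explicit, here is a purely illustrative sketch of how the `ChatMessage` list would have to be rendered into that template ({{prompt}} = system content, {{history}} = the user/assistant turns, {{char}} = the role the model answers as). This is not the code LlamaIndex or llamafile actually runs:

```python
# Hypothetical rendering, just to show the intended role -> header mapping.
def render(messages, char="assistant"):
    system = next(m.content for m in messages if m.role.value == "system")
    history = "".join(
        f"<|start_header_id|>{m.role.value}<|end_header_id|>\n\n{m.content}<|eot_id|>"
        for m in messages
        if m.role.value != "system"
    )
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>{history}"
        f"<|start_header_id|>{char}<|end_header_id|>\n\n"
    )

print(render(messages))
```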
I ask because if I run the example from https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local/ with the instruct llamafile, I don't get any output.
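Concretely, the setup I mean is that starter example with the Ollama LLM swapped for the llamafile one. A rough sketch (the `data` folder and the embedding model are just what the tutorial uses; the llamafile URL is an assumption):

```python
# Sketch of the local starter example with the llamafile LLM swapped in.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.llamafile import Llamafile

Settings.llm = Llamafile(base_url="http://localhost:8080")
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-base-en-v1.5")

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)  # with the instruct llamafile, this prints nothing for me
```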