Welcome message

Happy new year. I recently fine-tuned a model and I'm using Ollama to run it. It shows a welcome message in the Ollama terminal.
In my Python code, I use:
from llama_index.llms.ollama import Ollama

llm = Ollama(model="xxxx", request_timeout=60.0)
chat_engine = index.as_chat_engine(
    chat_mode="context",
    llm=llm,
    memory=memory,
)
How do I get the welcome message generated by the model when it starts?
9 comments
Not sure what you mean by welcome message? You can just print() any message you want?
The first image is llama3.2, the second one is our fine-tuned model.

The welcome message is "Hi my name is ....."

I don't know how to get the welcome message in Python.
Attachments: 0.png, 0.png
That's just a matter of setting a system prompt when you chat with the LLM.

For example:
index.as_chat_engine(..., system_prompt="Your name is Paul, always introduce yourself when you start a new conversation.")
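A minimal sketch of how that could look with the setup from the question. The index is assumed to already exist as in the original snippet, and "xxxx" stands in for the fine-tuned model name:

from llama_index.core.memory import ChatMemoryBuffer
from llama_index.llms.ollama import Ollama

# "xxxx" is a placeholder for the fine-tuned model; `index` is assumed to be
# the index already built in the original snippet.
llm = Ollama(model="xxxx", request_timeout=60.0)
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

chat_engine = index.as_chat_engine(
    chat_mode="context",
    llm=llm,
    memory=memory,
    system_prompt=(
        "Your name is Paul. Always introduce yourself when a new "
        "conversation starts."
    ),
)

# The model only generates text in response to a message, so the welcome
# message shows up on the first turn:
response = chat_engine.chat("Hello")
print(response)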
The system prompt is in SYSTEM, in the fourth row... the message is in the 3rd row... the messages are added into the language model. Can Python extract it from the model?
I don't think that system message is used by default in Ollama + llama-index
You need to provide your own
Set the system prompt on the chat engine imo
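If the aim is to reuse the SYSTEM prompt already baked into the model's Modelfile (the "fourth row" mentioned above), one option is to read it back with the ollama show --modelfile command and pass it in yourself. A rough sketch, assuming the ollama CLI is on the PATH, the SYSTEM instruction fits on a single line, and llm, index, and memory are defined as earlier:

import subprocess

def modelfile_system_prompt(model_name: str) -> str:
    # Dump the Modelfile for the given model and pull out the SYSTEM line.
    # Multi-line (triple-quoted) SYSTEM blocks would need extra handling.
    modelfile = subprocess.run(
        ["ollama", "show", model_name, "--modelfile"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in modelfile.splitlines():
        if line.strip().upper().startswith("SYSTEM"):
            return line.strip()[len("SYSTEM"):].strip().strip('"')
    return ""

# "xxxx" is a placeholder for the fine-tuned model name.
chat_engine = index.as_chat_engine(
    chat_mode="context",
    llm=llm,
    memory=memory,
    system_prompt=modelfile_system_prompt("xxxx"),
)

Either way, the prompt only reaches the model once you pass it to the chat engine from Python, as noted above.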