Happy new year! I recently fine-tuned a model and I'm running it with Ollama. It shows a welcome message in the Ollama terminal. In my Python code I use:

```python
llm = Ollama(model="xxxx", request_timeout=60.0)
chat_engine = index.as_chat_engine(
    chat_mode="context",
    llm=llm,
    memory=memory,
)
```

How can I get the welcome message that the model generates when it starts?
The system prompt is in the SYSTEM entry (the fourth row), and the welcome message is in the third row; those messages were added into the language model when it was built. Can Python extract them from the model?
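In case it helps: Ollama can return the Modelfile a model was built from through its local REST API (`POST /api/show`), so a welcome message defined with a `MESSAGE` directive can be read back from Python and, for example, seeded into your own chat history. This is a minimal sketch, not a definitive implementation — it assumes a local Ollama server on the default port, a model name placeholder `xxxx`, and single-line `MESSAGE role "text"` entries; `fetch_modelfile` and `extract_messages` are hypothetical helper names.

```python
import json
import re
import urllib.request


def fetch_modelfile(model: str, host: str = "http://localhost:11434") -> str:
    """Ask a local Ollama server for the Modelfile of `model` via /api/show."""
    req = urllib.request.Request(
        f"{host}/api/show",
        data=json.dumps({"model": model}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["modelfile"]


def extract_messages(modelfile: str) -> list:
    """Pull (role, text) pairs out of single-line MESSAGE directives."""
    pairs = []
    for m in re.finditer(r'^MESSAGE\s+(\w+)\s+(.+)$', modelfile, re.MULTILINE):
        role, text = m.group(1), m.group(2).strip().strip('"')
        pairs.append((role, text))
    return pairs


# The parser works the same on a Modelfile fetched from the server
# (fetch_modelfile("xxxx")) or on one read from disk; here a sample string
# stands in so the sketch runs without a server:
sample = '''FROM llama3
PARAMETER temperature 0.7
MESSAGE assistant "Welcome! How can I help you today?"
SYSTEM "You are a helpful assistant."
'''
print(extract_messages(sample))
# [('assistant', 'Welcome! How can I help you today?')]
```

Note that LlamaIndex's chat engine won't emit that greeting by itself: once you've extracted the text, you'd have to print it or insert it into `memory` as an initial assistant message yourself.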