For some reason the context is not set properly in the prompt template: after every character there is a newline. The context appears in the prompt as shown in the attached txt file from the log.
How do I output the prompt before it is sent to the LLM for answer generation? I want to check the format and confirm everything is correct before sending it to the LLM.
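For context, a minimal sketch of what might be going on (this is an assumption, since the actual pipeline code isn't shown): a very common cause of "one character per line" is calling `"\n".join(...)` on a single string instead of a list of strings, since joining iterates over the string character by character. The template, variable names, and question below are hypothetical; the same idea of formatting the template yourself and printing it lets you inspect the final prompt before it reaches the LLM.

```python
# Hypothetical template and inputs for illustration only.
template = (
    "Answer using the context below.\n\n"
    "Context:\n{context}\n\n"
    "Question: {question}"
)

docs = "relevant passage"      # a single string, not a list of documents
broken = "\n".join(docs)       # iterates over characters -> newline after each one
fixed = "\n".join([docs])      # wrap in a list (or pass the actual list of documents)

# To inspect the prompt before it is sent to the LLM, render it yourself and print it:
prompt = template.format(context=fixed, question="What does the passage say?")
print(prompt)
```

If you are using a framework such as LangChain, the same check usually works by calling the prompt template's own `format(...)` method with your variables and printing the result before invoking the chain.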