I use it here:
query_engine = index.as_chat_engine(
    chat_mode="context",
    similarity_top_k=similarity_top_k,
    llm=llm_engine,
    system_prompt=prepared_system_prompt,
)
And because I have the env var that enables OpenAI debug logging set, I can see this in the terminal:
Request options: {'method': 'post', 'url': '/chat/completions', 'files': None, 'json_data': {'messages': [{'role': 'system', 'content': '---------------------\nBelow is private information: '}, {'role': 'user', 'content': 'pricing'}], 'model': 'gpt-3.5-turbo', 'stream': False, 'temperature': 0.1}}
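For reference, a minimal sketch of how that debug output can be enabled with the `openai` Python client: the library reads the `OPENAI_LOG` environment variable at import time, so setting it to `debug` before the client is imported turns on request logging like the line above. (The exact value the author used is not shown in the post; `debug` is the standard one.)

```python
import os

# Must be set before the openai client is imported, since the
# library configures its logger based on this variable at import time.
os.environ["OPENAI_LOG"] = "debug"

# Subsequent imports of openai (e.g. indirectly via LlamaIndex) will
# now print each request's method, URL, and JSON payload to the terminal,
# which is how the system prompt and user message become visible.
print(os.environ["OPENAI_LOG"])
```

Setting the variable in the shell (`export OPENAI_LOG=debug`) before launching the script has the same effect and avoids the import-order concern.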