The community member asks whether there is a way to see the full prompts being sent by the chatEngine, since the getPrompts() method only reveals the systemContextPrompt. Commenters suggest that which prompts are visible depends on the specific chat engine in use. One community member points to the llama_index library's observability features, but another reports that these do not work for them. The comments also indicate that for some chat engines the systemPrompt may be the only prompt exposed: the engine may simply run a retriever and send the retrieved context along with the chat history to the language model, without storing that assembled prompt anywhere retrievable. No definitive answer is reached in the thread.
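Since the thread leaves the question open, here is a library-agnostic sketch of the underlying idea: if the engine assembles its final prompt (system prompt + retrieved context + chat history) and hands it to an LLM callable, wrapping that callable lets you capture every prompt actually sent. This is roughly what observability hooks do under the hood; the class and names below are hypothetical illustrations, not LlamaIndex APIs.

```python
from typing import Callable, List


class PromptLoggingLLM:
    """Wraps any LLM callable and records every prompt passed through it.

    Hypothetical sketch: a real chat engine would call an API here, but
    the interception pattern is the same -- the assembled prompt passes
    through __call__, so it can be inspected after the fact.
    """

    def __init__(self, llm: Callable[[str], str]):
        self._llm = llm
        self.prompts: List[str] = []  # every prompt actually sent

    def __call__(self, prompt: str) -> str:
        self.prompts.append(prompt)  # capture before forwarding
        return self._llm(prompt)


# Stand-in LLM for demonstration (a real engine would hit a model API):
fake_llm = lambda prompt: "stub answer"
llm = PromptLoggingLLM(fake_llm)

# A chat engine would assemble something like this before calling the LLM:
final_prompt = "System: answer from the context.\nContext: ...\nUser: hi"
answer = llm(final_prompt)

print(llm.prompts[0])  # the full prompt, including retrieved context
print(answer)          # stub answer
```

In practice, LlamaIndex's documented observability hooks (for example, the global "simple" handler, which prints LLM inputs and outputs) serve the same purpose without wrapping anything yourself; the sketch above just shows why a retriever-based engine's prompt may never appear in getPrompts() even though it is sent to the model.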