
Prompts Sent To Engine

At a glance

The community member is asking whether there is a way to see the prompts being sent with the chatEngine, since the getPrompts() method only reveals the systemContextPrompt. The replies suggest that which prompts are visible depends on the specific chat engine being used. One community member suggests the llama_index library's observability features, but another says they do not seem to work for them. The comments indicate that for some chat engines the systemPrompt may be the only prompt, and that a context chat engine only uses a retriever and sends the retrieved context along with the chat history to the language model. However, no definitive answer is provided in the comments.

Do I have a way to see the prompts that are being sent with the chatEngine? Doing getPrompts() only reveals the systemContextPrompt.
5 comments
What about:

import llama_index.core
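# the "simple" handler prints every LLM prompt and completion to stdout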
llama_index.core.set_global_handler("simple")

https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html
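For example, a minimal sketch with the Python library (assumes a small in-memory index and the default OpenAI models; the document text and question here are placeholders):

import llama_index.core
from llama_index.core import Document, VectorStoreIndex

# enable prompt logging before building the chat engine
llama_index.core.set_global_handler("simple")

index = VectorStoreIndex.from_documents(
    [Document(text="LlamaIndex is a data framework for LLM apps.")]
)
chat_engine = index.as_chat_engine(chat_mode="context")

# each underlying LLM call now prints its full prompt to stdout
response = chat_engine.chat("What is LlamaIndex?")
print(response)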
I use llamats; it does not seem to work.
Depending on the chat engine, that's likely the only prompt
Yeah? Only the systemPrompt? Should there not be a synthesizer?
Like I mentioned, it depends on the chat engine.

A context chat engine only uses a retriever, then sends the retrieved context + the chat history to the LLM (see the sketch below).
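Roughly, per turn (a conceptual sketch with stand-in retriever/LLM objects, not the library's actual internals):

def context_chat_turn(retriever, llm, chat_history, user_message):
    # 1. retrieve nodes relevant to the latest user message
    nodes = retriever.retrieve(user_message)
    context = "\n\n".join(node.text for node in nodes)

    # 2. stuff the retrieved text into the context/system prompt
    system_prompt = (
        "Context information is below.\n"
        f"{context}\n"
        "Answer using the context above."
    )

    # 3. send system prompt + prior chat history + new message to the LLM
    messages = [{"role": "system", "content": system_prompt}]
    messages += chat_history
    messages.append({"role": "user", "content": user_message})
    return llm.chat(messages)

There is no separate response synthesizer step here, which is why getPrompts() on such an engine surfaces only the system/context prompt.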