Prompts Sent To Engine

Do I have a way to see the prompts that are being sent with the chatEngine? Doing getPrompts() only reveals the systemContextPrompt.
what about:

import llama_index.core
llama_index.core.set_global_handler("simple")

https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html
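For example, here's a minimal sketch (assuming a local ./data directory and an OpenAI key configured; the file names are illustrative). With the "simple" handler set, the full prompt the chat engine sends is printed to stdout:

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex, set_global_handler

# Print every LLM input/output, including the fully assembled prompt
set_global_handler("simple")

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

chat_engine = index.as_chat_engine(chat_mode="context")
print(chat_engine.chat("What does the document say?"))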
I use llamats, and that doesn't seem to work.
Depending on the chat engine, that's likely the only prompt
Yeah? Only the systemPrompts? Shouldn't there be a synthesizer?
Like I mentioned, it depends on the chat engine.

A context chat engine only uses a retriever, then sends the retrieved context plus the chat history to the LLM.
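Roughly, the flow looks like this (a simplified Python sketch reusing the index from above; the prompt wording is illustrative, not the library's exact template):

from llama_index.core.llms import ChatMessage
from llama_index.llms.openai import OpenAI

llm = OpenAI()
retriever = index.as_retriever()

def context_chat(message: str, chat_history: list[ChatMessage]) -> str:
    # 1. Retrieve nodes relevant to the new user message
    nodes = retriever.retrieve(message)
    context = "\n\n".join(n.get_content() for n in nodes)

    # 2. Stuff the retrieved context into a single system message
    system = ChatMessage(
        role="system",
        content=f"Answer using the context below.\n\n{context}",
    )

    # 3. Send system message + chat history + new message to the LLM;
    #    there is no separate synthesizer prompt in this flow
    messages = [system, *chat_history, ChatMessage(role="user", content=message)]
    return llm.chat(messages).message.content

That's why getPrompts() only surfaces the system/context prompt here: it's the only templated prompt in the flow.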