Prompts Sent To Engine
shakedbuk
11 months ago
Do I have a way to see the prompts that are being sent with the chatEngine? Doing getPrompts() only reveals the systemContextPrompt.
iach
11 months ago
what about:
import llama_index.core
llama_index.core.set_global_handler("simple")
https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html
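For readers unsure what the "simple" global handler buys you, here is a rough, self-contained sketch of the idea, not the llama_index API itself: every prompt is printed before being forwarded to the underlying LLM, so you see exactly what the engine sends. The `EchoLLM` and `LoggingLLM` classes below are hypothetical stand-ins.

```python
# Hypothetical sketch of prompt logging, NOT the llama_index API:
# a thin wrapper that prints every prompt before delegating to the
# real LLM, which is roughly what the "simple" handler gives you.

class EchoLLM:
    """Stand-in for a real LLM: returns a canned reply."""
    def complete(self, prompt: str) -> str:
        return f"reply to: {prompt!r}"

class LoggingLLM:
    """Wraps any object with a .complete(prompt) method and records prompts."""
    def __init__(self, inner):
        self.inner = inner
        self.seen_prompts = []  # every prompt that passed through

    def complete(self, prompt: str) -> str:
        self.seen_prompts.append(prompt)
        print(f"** Prompt **\n{prompt}\n")  # visible trace of what is sent
        return self.inner.complete(prompt)

llm = LoggingLLM(EchoLLM())
llm.complete("What is a context chat engine?")
```

If you route the chat engine through a wrapper like this, the full prompt (context included) shows up in the log rather than only the system prompt.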
shakedbuk
11 months ago
I use llamats; it does not seem to work.
Logan M
11 months ago
Depending on the chat engine, that's likely the only prompt
shakedbuk
11 months ago
yeah? only the systemPrompts? should there not be a synthesizer?
Logan M
11 months ago
like I mentioned, depends on the chat engine
A context chat engine only uses a retriever, then sends that + the chat history to the LLM
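That flow can be sketched in a few lines. This is an illustrative toy, not llama_index code: the `retrieve` and `build_prompt` names, the toy corpus, and the prompt layout are all assumptions, but the shape matches what Logan describes: retrieve context for the message, then send context plus chat history to the LLM in a single prompt, with no separate synthesizer step.

```python
# Hypothetical sketch of a context chat engine's single LLM call.
# Names and prompt layout are illustrative, not llama_index's own.

def retrieve(query: str) -> list[str]:
    """Toy retriever: returns fixed 'documents' whose key appears in the query."""
    corpus = {
        "chat engine": ["A context chat engine retrieves context per message."],
    }
    return [doc for key, docs in corpus.items() if key in query for doc in docs]

def build_prompt(query: str, history: list[tuple[str, str]]) -> str:
    """Assemble the one prompt the engine sends: context + history + query."""
    context = "\n".join(retrieve(query))
    turns = "\n".join(f"{role}: {msg}" for role, msg in history)
    return (
        f"Context information:\n{context}\n\n"
        f"Chat history:\n{turns}\n\n"
        f"user: {query}"
    )

history = [("user", "hi"), ("assistant", "hello!")]
prompt = build_prompt("what is a chat engine?", history)
print(prompt)  # this one prompt is everything the engine sends to the LLM
```

Under this model, getPrompts() exposing only the system/context prompt is consistent: there is no second synthesizer prompt to show.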