Prompts Sent To Engine
shakedbuk
9 months ago
Is there a way to see the prompts that are being sent with the chatEngine? Doing getPrompts() only reveals the systemContextPrompt.
iach
9 months ago
what about:
import llama_index.core
llama_index.core.set_global_handler("simple")
https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html
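For illustration, the "simple" handler essentially prints each payload sent to the LLM. Here's a toy sketch of that logging pattern in plain Python, with a hypothetical fake_llm standing in for a real LLM call (this is not the actual LlamaIndex implementation):

```python
# Toy sketch of what a "simple" observability handler does:
# wrap each LLM call so the full prompt is recorded and printed
# before it is sent. `fake_llm` is a stand-in, not a LlamaIndex API.

sent_prompts = []  # every prompt that went to the "LLM"

def with_prompt_logging(llm_call):
    """Wrap an LLM call so every prompt is captured and printed."""
    def wrapped(prompt: str) -> str:
        sent_prompts.append(prompt)        # keep a copy for inspection
        print(f"** Prompt **\n{prompt}")   # the "simple" handler just prints
        return llm_call(prompt)
    return wrapped

def fake_llm(prompt: str) -> str:
    return "stub answer"

llm = with_prompt_logging(fake_llm)
llm("Context: ...\nQuestion: what prompts are sent?")
```

The global handler hooks this kind of logging into every LLM call the framework makes, which is why it surfaces prompts that getPrompts() alone does not show.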
shakedbuk
9 months ago
I use LlamaIndex.TS; that doesn't seem to work there.
Logan M
9 months ago
Depending on the chat engine, that's likely the only prompt
shakedbuk
9 months ago
Yeah? Only the system prompt? Shouldn't there also be a synthesizer prompt?
Logan M
9 months ago
Like I mentioned, it depends on the chat engine.
A context chat engine only uses a retriever, then sends that + the chat history to the LLM
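That flow can be sketched in a few lines of Python, with a mocked retriever (the function names here are illustrative, not the real engine internals): retrieved chunks are stuffed into one system/context prompt, chat history is appended, and a single LLM call is made, so that context prompt is often the only prompt involved.

```python
# Minimal sketch of a context chat engine's prompt assembly,
# assuming a mocked retriever. No synthesizer prompt is used:
# retrieved context + history + new message -> one LLM call.

def retrieve(query: str) -> list[str]:
    # Stand-in for a vector-store retriever.
    return ["Doc chunk A about prompts.", "Doc chunk B about engines."]

def build_messages(query: str, history: list[dict]) -> list[dict]:
    context = "\n".join(retrieve(query))
    system = f"Answer using only this context:\n{context}"
    # One system/context message + prior turns + the new user message.
    return [{"role": "system", "content": system},
            *history,
            {"role": "user", "content": query}]

messages = build_messages(
    "How do I see prompts?",
    [{"role": "user", "content": "hi"},
     {"role": "assistant", "content": "hello"}],
)
```

This is why a response-synthesizer prompt never shows up for this engine type: there is no separate refine/synthesis step, just the one assembled message list.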