Find answers from the community

enoch
Offline, last seen 3 months ago
Joined September 25, 2024
enoch

Logging

Feel like this is very basic, but I can't quite get it to work: how can I see exactly the prompt that is actually sent to the LLM, that is, the exact system message and user message with all the docs filled in? Something is going wrong and I need to debug it, and I want to start by just verifying the exact string sent to the LLM.
15 comments
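One way to do this, as a sketch: turn on DEBUG logging with the standard library, which makes LlamaIndex's LLM clients log their outgoing requests, so the full prompt text shows up on stdout. (LlamaIndex also ships a "simple" observability handler, set via set_global_handler("simple"), that prints LLM inputs and outputs; check the observability docs for your version.) The demo logger call at the end is illustration only, not a real LLM request.

```python
import logging
import sys

# Route all library logging to stdout at DEBUG level. LlamaIndex's LLM
# clients log their outgoing requests, so the exact prompt string sent to
# the model becomes visible. force=True replaces any handler a notebook
# environment may have installed already.
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG, force=True)

# Illustration only: any DEBUG-level message emitted after this point
# is printed, including prompt payloads logged by the library.
logging.getLogger("demo").debug("exact prompt text would appear here")
```

Run this before building the index or query engine so no earlier call configures logging first.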
Is there any way to chat directly with the Dosu bot on GitHub? Or any RAG chat over the LlamaIndex docs generally?
5 comments
I see something like node.excluded_llm_metadata_keys = ["vector"] will work for that one node... but I want this to apply to all nodes coming from OS at all times. And I don't want to blacklist like this; I'd like to whitelist. Something like forallnodes.included_llm_metadata_keys = None, without the loop.
8 comments
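As far as I know there is no global whitelist switch, but the blacklist can be derived from a whitelist in one pass over the retrieved nodes: exclude every metadata key that is not in the allowed set. A minimal sketch; the Node class here is a hypothetical stand-in for a LlamaIndex TextNode (real nodes come from your retriever), and whitelist_metadata is a helper name I made up:

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for a LlamaIndex TextNode, just enough to show
# the pattern: a metadata dict plus the excluded_llm_metadata_keys list
# that controls what the LLM sees.
@dataclass
class Node:
    metadata: dict
    excluded_llm_metadata_keys: list = field(default_factory=list)

def whitelist_metadata(nodes, allowed):
    """Emulate a whitelist: exclude every metadata key not in `allowed`."""
    for node in nodes:
        node.excluded_llm_metadata_keys = [
            key for key in node.metadata if key not in allowed
        ]

nodes = [Node(metadata={"vector": [0.1], "title": "doc1", "page": 3})]
whitelist_metadata(nodes, allowed={"title"})
# nodes[0].excluded_llm_metadata_keys now hides "vector" and "page",
# leaving only "title" visible to the LLM.
```

Applied as a post-processing step over retrieved nodes, this keeps the whitelist in one place instead of hand-maintaining an exclusion list per node.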