anyone know if it's possible to pass in the metadata from the retrieved documents into context_str?
it's included by default actually
but doesn't context_str consist of retrieved_document.text?
what i want to do is ''.join(retrieved_document.metadata.field_name for retrieved_document in retrieved_documents)
Python
from llama_index.schema import MetadataMode
print(document.get_content(metadata_mode=MetadataMode.LLM))
that's what gets sent to the LLM
read the link above, it shows how to customize this πŸ‘
another question: i've tried out observability tools like Arize, but they seem to just show the inputs and outputs. I'm wondering if there's a way to log what gets passed around in the response_synthesizer.
What are you looking to see exactly? The LLM inputs / outputs are what gets passed around in the response synthesizer πŸ€”
yeah, so I'm seeing that the correct documents are retrieved from Pinecone, but for some reason the response is not what I'm expecting. So I'd like to see whether the correct context_str was passed into response_synthesizer:text_qa_template
Doesn't Arize log the exact LLM input? That will contain the context_str.