Hello, I am working with LlamaIndex to get answers from my own documents with Azure OpenAI. I am trying to find out if LlamaIndex has functionality that can return the relevant piece of text from which it answered the query. I found get_formatted_sources(), but it doesn't give the relevant piece of text, just the whole relevant document.
The sources are the text that the LLM read to write the answer. It's hard to get more specific than that in a reliable way. A common approach is fuzzy matching the answer back against the source text (rough sketch below).
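A minimal sketch of the fuzzy-matching idea, assuming a standard LlamaIndex response object that exposes `source_nodes` (exact attribute names can vary by version); the window size, step, and use of difflib are just illustrative choices:

```python
# Sketch: fuzzy-match the answer against the retrieved source nodes to
# surface the most relevant snippet. Assumes `response` comes from a
# LlamaIndex query engine and exposes `response.source_nodes`
# (NodeWithScore objects).
from difflib import SequenceMatcher


def best_matching_snippets(response, window: int = 200, top_k: int = 3):
    answer = str(response)
    candidates = []
    for node_with_score in response.source_nodes:
        text = node_with_score.node.get_content()
        # Slide a window over the source text, score its similarity to the
        # answer, and keep the highest-scoring snippets.
        for start in range(0, max(1, len(text) - window), window // 2):
            snippet = text[start : start + window]
            score = SequenceMatcher(None, answer, snippet).ratio()
            candidates.append((score, snippet))
    candidates.sort(key=lambda pair: pair[0], reverse=True)
    return candidates[:top_k]
```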
The other approach is using something like the CitationQueryEngine (or making your own). But all it's doing is splitting the text into smaller numbered chunks and asking the LLM to cite its sources. A little unreliable imo (rough sketch below).
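If you go the CitationQueryEngine route, it looks roughly like this; the import path and argument values differ between LlamaIndex versions, and `index` is assumed to be an already-built index:

```python
# Sketch: CitationQueryEngine splits retrieved text into small numbered
# chunks and prompts the LLM to cite them in its answer.
from llama_index.core.query_engine import CitationQueryEngine

query_engine = CitationQueryEngine.from_args(
    index,
    similarity_top_k=3,
    citation_chunk_size=256,  # smaller chunks -> more precise citations
)

response = query_engine.query("What does the document say about X?")
print(response)  # answer text with [1], [2]-style citations
for source in response.source_nodes:
    # The cited chunks live here, so citations can be mapped back
    # to the exact piece of text.
    print(source.node.get_content())
```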
I thought source_nodes in the response object would give you the relevant nodes, but I'm having a problem with that and just posted a question about it.
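For reference, this is roughly how I was expecting to read them, assuming the response object exposes source_nodes the usual way:

```python
# Sketch: inspect the nodes the engine retrieved for a query.
response = query_engine.query("my question about the document")
for node_with_score in response.source_nodes:
    print(node_with_score.score)               # retrieval similarity score
    print(node_with_score.node.get_content())  # text of the retrieved chunk
```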