Llama Index Answering Queries with Relevant Text Sources
At a glance
The community member is working with LlamaIndex to get answers for their own documents using Azure OpenAI, and they are trying to find a way to get the relevant piece of text from which LlamaIndex generated the answer, rather than just the entire relevant document. The comments discuss some approaches, such as fuzzy matching and using a CitationQueryEngine, but note that these may not be entirely reliable. There is no explicitly marked answer in the comments.
Hello, I am working with LlamaIndex to get answers from my own documents with Azure OpenAI. I am trying to find out whether LlamaIndex has functionality that can return the specific piece of text from which it answered the query. I found get_formatted_sources(), but it doesn't give the relevant piece of text, only the whole relevant document.
The sources are the text that the LLM read to write the answer. It's hard to get more specific than that in a reliable way. A common approach is fuzzy matching.
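As a rough illustration of the fuzzy-matching idea, here is a minimal sketch using Python's standard-library `difflib`: split a source node's text into sentences and score each one against a sentence from the answer. The function name and the naive sentence split are illustrative assumptions, not part of LlamaIndex.

```python
import difflib


def best_matching_passage(answer_sentence: str, source_text: str):
    """Return (score, sentence) for the source sentence most similar
    to the given answer sentence, using difflib's ratio as the score."""
    # Naive sentence split on periods; a real implementation might use
    # a proper sentence tokenizer.
    sentences = [s.strip() for s in source_text.split(".") if s.strip()]
    scored = [
        (difflib.SequenceMatcher(None, answer_sentence.lower(), s.lower()).ratio(), s)
        for s in sentences
    ]
    return max(scored)
```

You would run this over each node's text in `response.source_nodes` and keep the highest-scoring passages; it is only approximate, since the LLM paraphrases rather than quotes.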
The other approach is using something like the CitationQueryEngine (or making your own). But all it's doing is splitting the text into smaller numbered chunks and asking the LLM to cite its sources. A little unreliable imo.
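The mechanism described above (number the chunks, ask the model to cite them, then map citations back to text) can be sketched without LlamaIndex at all. This is not the actual CitationQueryEngine implementation, just a standalone illustration of the idea; all names are hypothetical.

```python
import re


def build_citation_prompt(question: str, texts: list[str], chunk_size: int = 200):
    """Split source texts into small numbered chunks and build a prompt
    asking the model to cite them like [1], [2], ..."""
    chunks = []
    for text in texts:
        for i in range(0, len(text), chunk_size):
            chunks.append(text[i:i + chunk_size])
    numbered = "\n".join(f"Source [{n}]: {c}" for n, c in enumerate(chunks, 1))
    prompt = (
        f"{numbered}\n\n"
        f"Answer the question and cite sources like [1].\n"
        f"Question: {question}\n"
    )
    return prompt, chunks


def cited_chunks(answer: str, chunks: list[str]) -> list[str]:
    """Map citation markers like [2] in the model's answer back to chunk text."""
    ids = {int(m) for m in re.findall(r"\[(\d+)\]", answer)}
    return [chunks[i - 1] for i in sorted(ids) if 1 <= i <= len(chunks)]
```

The unreliability the comment mentions shows up in the second step: the model may cite the wrong chunk, cite nothing, or hallucinate a number, so the mapping is only as trustworthy as the model's citations.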
I thought source_nodes in the response object would give you the relevant nodes, but I'm having a problem with that and just posted a question about it.