Llama Index Answering Queries with Relevant Text Sources

Hello, I am working with LlamaIndex to answer questions over my own documents with Azure OpenAI. I am trying to find out whether LlamaIndex has functionality that can return the relevant piece of text from which it answered the query. I found get_formatted_sources(), but it doesn't give the relevant piece of text, only the whole relevant document.
6 comments
The sources are the text that the LLM read to write the answer. It's hard to get more specific than that in a reliable way. A common approach is fuzzy matching.
But fuzzy matching wouldn’t be that accurate
It'd be pretty close
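
For reference, a minimal sketch of the fuzzy-matching idea in plain Python, assuming you already have a LlamaIndex response object in hand (the `response` / `source_nodes` usage in the commented lines is the standard response API, but the function and window size here are just placeholders, not anything built into the library):

```python
# Compare the answer text against sliding windows of each source node's text
# and keep the best-matching snippet, using stdlib difflib for similarity.
from difflib import SequenceMatcher

def best_matching_snippet(answer_text: str, source_text: str, window: int = 200) -> tuple[str, float]:
    """Return the substring of source_text most similar to answer_text."""
    best_snippet, best_ratio = "", 0.0
    step = max(1, window // 2)
    for start in range(0, max(1, len(source_text) - window + 1), step):
        snippet = source_text[start:start + window]
        ratio = SequenceMatcher(None, answer_text.lower(), snippet.lower()).ratio()
        if ratio > best_ratio:
            best_snippet, best_ratio = snippet, ratio
    return best_snippet, best_ratio

# Usage sketch: for each retrieved node, print the passage closest to the answer.
# for node_with_score in response.source_nodes:
#     snippet, score = best_matching_snippet(str(response), node_with_score.node.get_content())
#     print(f"{score:.2f}: {snippet!r}")
```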
The other approach is using something like the CitationQueryEngine (or making your own, see the sketch below). But all it's doing is splitting the text into smaller numbered chunks and asking the LLM to cite its sources. A little unreliable imo

Open to other ideas 🙂
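
A rough sketch of the CitationQueryEngine approach, assuming you already have a VectorStoreIndex called `index` built over your documents (the import path may differ slightly depending on your llama_index version):

```python
# CitationQueryEngine splits retrieved text into small numbered chunks and
# prompts the LLM to cite them inline, e.g. "[1]", "[2]".
from llama_index.core.query_engine import CitationQueryEngine

query_engine = CitationQueryEngine.from_args(
    index,
    citation_chunk_size=256,  # size of the numbered chunks the LLM can cite
)

response = query_engine.query("What does the document say about X?")
print(response)  # answer text with [n]-style citations

# The numbered chunks behind those citations are on the response object.
for node_with_score in response.source_nodes:
    print(node_with_score.node.get_content())
```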
I thought the source_nodes on the response object would give you the relevant nodes, but I'm having a problem with that and just posted a question about it.
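
For context, source_nodes is usually read like this (a minimal sketch, assuming a query_engine already built from your index):

```python
# Each entry is a NodeWithScore: the chunk the retriever handed to the LLM,
# plus its similarity score and any metadata (file name, page number, ...).
response = query_engine.query("your question here")

for node_with_score in response.source_nodes:
    print(node_with_score.score)              # retrieval similarity score
    print(node_with_score.node.get_content()) # chunk text the LLM actually saw
    print(node_with_score.node.metadata)      # e.g. file_name, page_label
```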