Updated 12 months ago

Guys, any thoughts on this - Hi, is it possible to

At a glance

The community members are discussing how to get the source data mentioned along with the answer in LlamaIndex. One community member suggests using the "create and refine" mode to include source metadata in the response. Other community members suggest extracting source_nodes and printing their text to save tokens, as well as using the CitationQueryEngine together with another QueryEngine like RaptorRetriever. The community members also discuss passing the index along with the retriever to the CitationQueryEngine arguments.

Guys, any thoughts on this - Hi, is it possible to get this type of response in LlamaIndex, where the source data is mentioned along with the answer? Basically, getting the source data for each part of the response. One obvious way to do it, I guess, is by prompting the LLM in the "create and refine" mode to mention source metadata along with the answer.
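As a rough illustration of the prompting approach the asker describes, here is a minimal sketch that passes a custom text_qa_template to a query engine running in refine mode. The ./data directory, the query string, and the prompt wording are placeholders, not anything from the thread:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, PromptTemplate

# Hypothetical documents; assumes a default LLM/embedding setup is configured.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Custom QA prompt asking the LLM to mention source metadata inline with each claim.
qa_prompt = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the query and, after each claim, mention which source "
    "(file name or other metadata) it came from.\n"
    "Query: {query_str}\n"
    "Answer: "
)

query_engine = index.as_query_engine(
    response_mode="refine",       # "create and refine" synthesis
    text_qa_template=qa_prompt,
)
response = query_engine.query("What does the report say about revenue?")
print(response)
```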
7 comments
Yeah, that is one way. Plus, you can always extract source_nodes from the response and add the node text or metadata info to your final answer.
In order to save tokens, you should print each source node's get_text() output rather than the full node objects.
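A minimal sketch of what that looks like, assuming a query_engine built earlier and a placeholder query string:

```python
response = query_engine.query("What does the report say about revenue?")

# Each entry in source_nodes is a NodeWithScore; get_text() returns only the
# chunk text, which is cheaper to display than the whole node object.
for node_with_score in response.source_nodes:
    print(node_with_score.get_text())
    print(node_with_score.metadata)   # e.g. file_name, page_label
    print("-" * 40)
```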
Hey, I was asking if we can mention the source nodes in the response part-wise, like this:
Attachment: Screenshot_2024-03-21_at_6.50.06_AM.png
It's basically mentioning which part of the answer is from which source.
Could we use the CitationQueryEngine together with another QueryEngine? For example, I use the RaptorRetriever, but I'd also like to try the CitationQueryEngine.
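One way to combine the two, as mentioned in the summary above, is to pass the index along with the retriever to the CitationQueryEngine arguments. A minimal sketch, assuming an existing index and an already-built raptor_retriever (both hypothetical here):

```python
from llama_index.core.query_engine import CitationQueryEngine

# `index` is an existing VectorStoreIndex; `raptor_retriever` is assumed to be
# a RaptorRetriever built separately (e.g. from the RAPTOR llama pack).
citation_engine = CitationQueryEngine.from_args(
    index,
    retriever=raptor_retriever,     # use RAPTOR for retrieval instead of the default
    citation_chunk_size=512,        # how finely sources are split into citeable chunks
)

response = citation_engine.query("Summarize the key findings with citations.")
print(response)                     # answer text with [1], [2] style citation markers
for i, node in enumerate(response.source_nodes, start=1):
    print(f"[{i}]", node.get_text()[:200])
```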