The community members are discussing how to get the source data mentioned along with the answer in LlamaIndex. One community member suggests using the "create and refine" mode to include source metadata in the response. Other community members suggest extracting source_nodes and printing their text to save tokens, as well as using the CitationQueryEngine together with another QueryEngine like RaptorRetriever. The community members also discuss passing the index along with the retriever to the CitationQueryEngine arguments.
Guys, any thoughts on this? Hi, is it possible to get this type of response in LlamaIndex, where the source data is mentioned along with the answer? Basically, getting the source data for each part of the response. One obvious way to do it, I guess, is by prompting the LLM in the "create and refine" mode to mention source metadata along with the answer.
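A cheaper route suggested in the thread is to skip the prompting entirely and read the sources off the response object's `source_nodes` after the query, then format them yourself. The sketch below shows that pattern; `SimpleNode` and `SimpleResponse` are hypothetical stand-ins for LlamaIndex's `NodeWithScore` / `Response` objects so the pattern runs without an index or an API key.

```python
# Pattern from the thread: instead of asking the LLM to repeat source
# metadata (which spends tokens), pull the sources directly from the
# response. SimpleNode / SimpleResponse are hypothetical stand-ins for
# LlamaIndex's node and response types, used here only so this runs offline.
from dataclasses import dataclass, field


@dataclass
class SimpleNode:
    text: str
    metadata: dict = field(default_factory=dict)


@dataclass
class SimpleResponse:
    response: str
    source_nodes: list


def format_sources(response) -> str:
    """Append each source node's metadata and a short text snippet to the answer."""
    lines = [str(response.response), "", "Sources:"]
    for i, node in enumerate(response.source_nodes, start=1):
        source = node.metadata.get("file_name", "unknown")
        snippet = node.text[:80]
        lines.append(f"[{i}] {source}: {snippet}")
    return "\n".join(lines)


resp = SimpleResponse(
    response="The warranty lasts two years.",
    source_nodes=[
        SimpleNode(
            text="Warranty coverage is 24 months from purchase.",
            metadata={"file_name": "warranty.pdf"},
        )
    ],
)
print(format_sources(resp))
```

With a real LlamaIndex response, the items in `response.source_nodes` are `NodeWithScore` objects, so the text comes from `node.node.get_text()` and the metadata from `node.node.metadata`; adjust the accessors accordingly.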
Could we use the CitationQueryEngine together with another QueryEngine? For example, I use the RaptorRetriever, but I'd also like to try the CitationQueryEngine.
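Following the suggestion at the end of the summary (passing the index along with the retriever to the CitationQueryEngine arguments), this might look like the sketch below. It is untested: it assumes `llama-index` and the Raptor pack are installed, an OpenAI key is set, that `RaptorRetriever` can be constructed this way, and that `from_args` accepts a retriever override; check the current LlamaIndex docs before relying on it.

```python
# Hedged sketch, not verified: combine a custom retriever (RaptorRetriever)
# with CitationQueryEngine by passing both the index and the retriever
# to from_args, as suggested in the thread.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.query_engine import CitationQueryEngine
from llama_index.packs.raptor import RaptorRetriever

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Hypothetical construction; see the Raptor pack docs for the real signature.
raptor_retriever = RaptorRetriever(documents)

query_engine = CitationQueryEngine.from_args(
    index,
    retriever=raptor_retriever,  # per the thread: retriever alongside the index
    citation_chunk_size=512,     # size of the numbered citation chunks
)

response = query_engine.query("What does the warranty cover?")
print(response)  # answer text with inline [1], [2] citations
for node in response.source_nodes:  # the cited chunks themselves
    print(node.node.get_text()[:100])
```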