I have used both LangChain and LlamaIndex as indexes and noticed one major difference: LangChain gives me an answer based on several sources (not just one document) and returns those sources to me, while LlamaIndex always gives me a response based on a single document. Is it possible to get the same behavior as LangChain, i.e. answers based on several sources?
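To make the behavior I mean concrete, here is a toy Python sketch (word-overlap scoring stands in for real embeddings, and the documents are made up): score every document against the query, keep the top k, and report all of them as sources, instead of answering from only the single best match.

```python
# Toy illustration of multi-source retrieval: rank every document
# against the query and keep the top k as sources for the answer.
# Word-overlap scoring is a stand-in for real embedding similarity.

def score(query: str, doc: str) -> int:
    """Count words shared between the query and a document (toy similarity)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve_top_k(query: str, docs: dict[str, str], k: int = 2) -> list[str]:
    """Return the names of the k best-matching documents."""
    ranked = sorted(docs, key=lambda name: score(query, docs[name]), reverse=True)
    return ranked[:k]

# Hypothetical corpus for illustration only.
docs = {
    "doc_a.txt": "LangChain combines chunks from several sources",
    "doc_b.txt": "sources are returned together with the answer",
    "doc_c.txt": "unrelated notes about something else",
}

sources = retrieve_top_k("answer based on several sources", docs, k=2)
print(sources)  # two source documents, not just the single best one
```

This is the multi-source behavior I see from LangChain and would like to reproduce: the final answer draws on all retrieved chunks, and the source list is returned alongside it.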