RAG

Hello! I have loaded a large amount of data into LlamaIndex, and I am looking for a way to:

  • Restrict responses to only the data I have provided myself. If a query pertains to information not present in my data, I would like to receive something like: "Sorry, I don't have knowledge of that."
  • Include references for each generated response. Whenever a response is provided, it would be very helpful if it were accompanied by the documents and line numbers from which the information was sourced. This would allow for increased transparency and traceability.
Could you please guide me on the best way to implement these features with LlamaIndex?
2 comments
  1. You can modify the prompt for your query so that it restricts the LLM to answering only from the provided context.
https://docs.llamaindex.ai/en/stable/module_guides/models/prompts.html#usage-pattern
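As a rough sketch of that idea, here is what a restrictive QA prompt could look like. In LlamaIndex you would wrap a string like this in a `PromptTemplate` and pass it to the query engine (see the docs link above); the template text, the refusal wording, and the `build_prompt` helper below are illustrative, not the library's own defaults.

```python
# A restrictive QA prompt: the model is told to answer ONLY from the
# supplied context and to refuse otherwise. {context_str} and {query_str}
# are the placeholder names LlamaIndex QA prompts conventionally use.
RESTRICTIVE_QA_TMPL = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the query using ONLY the context above. If the answer is not "
    "contained in the context, reply exactly: "
    "\"Sorry, I don't have knowledge of that.\"\n"
    "Query: {query_str}\n"
    "Answer: "
)

def build_prompt(context_str: str, query_str: str) -> str:
    """Fill the template with retrieved context and the user's query."""
    return RESTRICTIVE_QA_TMPL.format(
        context_str=context_str, query_str=query_str
    )

prompt = build_prompt(
    "LlamaIndex supports custom prompt templates.",
    "What does LlamaIndex support?",
)
print(prompt)
```

The key part is the explicit refusal instruction: it gives the LLM a concrete fallback response instead of letting it improvise from its own training data.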
You can also use node postprocessors to keep only high-quality nodes from the retrieved content.
https://docs.llamaindex.ai/en/stable/module_guides/querying/node_postprocessors/node_postprocessors.html#similaritypostprocessor
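To make the postprocessor idea concrete, here is a minimal stand-alone sketch of what a similarity cutoff does: drop retrieved nodes whose score falls below a threshold so only strong matches reach the LLM. The `ScoredNode` class and the 0.7 cutoff are stand-ins for illustration; in LlamaIndex itself you would pass `node_postprocessors=[SimilarityPostprocessor(similarity_cutoff=...)]` when building the query engine, per the docs linked above.

```python
from dataclasses import dataclass

@dataclass
class ScoredNode:
    # Stand-in for a retrieved node paired with its similarity score.
    text: str
    score: float

def filter_by_similarity(nodes, cutoff=0.7):
    """Keep only nodes whose similarity score meets the cutoff."""
    return [n for n in nodes if n.score >= cutoff]

nodes = [
    ScoredNode("relevant passage", 0.82),
    ScoredNode("weak match", 0.41),
]
kept = filter_by_similarity(nodes)
print([n.text for n in kept])  # only the high-scoring node survives
```

Raising the cutoff makes refusals more likely on out-of-scope queries, because weak matches never reach the LLM in the first place, which complements the restrictive prompt.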

  2. Every response contains source_nodes, which holds all the nodes used to generate the response.
You'll get the text used, the metadata, and the similarity score.

Python
response = query_engine.query("Oh interesting, tell me more.")
print(response.source_nodes)
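For nicer citations than a raw print, you can format each source node yourself. Each entry in `response.source_nodes` carries the chunk text, metadata (such as the originating file name), and the similarity score; the dicts below are stand-ins mimicking that shape so the formatting logic is runnable on its own, and the `file_name` metadata key is an assumption that depends on how your documents were loaded.

```python
def format_citations(source_nodes):
    """Render each source node as a numbered citation line."""
    lines = []
    for i, node in enumerate(source_nodes, start=1):
        src = node["metadata"].get("file_name", "unknown source")
        snippet = node["text"][:60]
        lines.append(f"[{i}] {src} (score: {node['score']:.2f}): {snippet}")
    return "\n".join(lines)

# Stand-in data shaped like LlamaIndex source nodes.
fake_nodes = [
    {"text": "LlamaIndex returns source nodes with every response.",
     "metadata": {"file_name": "guide.md"}, "score": 0.88},
    {"text": "Node postprocessors can filter low-quality matches.",
     "metadata": {"file_name": "notes.txt"}, "score": 0.74},
]
print(format_citations(fake_nodes))
```

Note that source attribution is at the node (chunk) level; if you need line numbers from the original files, you would have to store them in each node's metadata at ingestion time.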
Thank you πŸ™‚