
Updated last year

I have a basic RAG with TheBloke Llama 2

I have a basic RAG with TheBloke/Llama-2-13B-chat-GPTQ and BAAI/bge-large-en-v1.5 working on 2 PDF docs. I would like:
  1. the response to include the source document
  2. continuous conversation with chat history
Can this be done with llama-index?
2 comments
  1. check response.source_nodes
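A minimal sketch of inspecting source nodes, assuming the two PDFs live in a local `data/` directory and the LLM and embedding models are already configured in llama-index's settings (the directory name and query string here are placeholders, not from the thread):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Load the two PDFs and build a vector index over them.
# Assumes TheBloke/Llama-2-13B-chat-GPTQ and BAAI/bge-large-en-v1.5
# are wired up as the LLM and embedding model elsewhere.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
response = query_engine.query("What does the first document cover?")

# Each entry in response.source_nodes is a retrieved chunk with a
# similarity score and the metadata of the file it came from.
for source in response.source_nodes:
    print(source.node.metadata.get("file_name"), source.score)
    print(source.node.get_content()[:200])
```

The file name in `node.metadata` is what lets the response cite its source document.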
  2. check out the context chat engine (my guess is Llama 2 won't work well with any other chat engine type)
https://gpt-index.readthedocs.io/en/stable/examples/chat_engine/chat_engine_context.html
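A sketch of the context chat engine from that docs page, again assuming the index is built from the two PDFs as above; the questions are placeholders:

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# The context chat engine retrieves relevant chunks for each message
# and keeps the running chat history, so follow-ups work.
chat_engine = index.as_chat_engine(chat_mode="context")

response = chat_engine.chat("Summarise the second document.")
print(response)

# The follow-up question reuses the accumulated history automatically.
follow_up = chat_engine.chat("How does that compare to the first one?")
print(follow_up)

# Clear the history between sessions if needed.
chat_engine.reset()
```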
Thank you Sir