
Updated 3 months ago

Hi! I'm trying to determine how I can get the reference (text string and doc name) for a simple RAG app I'm building with local PDFs. I'm using the framework from create-llama (LlamaParse too) + Weaviate. I'm looking at the following line in chat.py:
Plain Text
response = await chat_engine.astream_chat(lastMessage.content, messages)


And I'm wondering how I could do this. Do I need to provide a system prompt and hope the LLM returns something, or can I specify something so it returns whatever context string (+ sources) it worked with?
6 comments
It depends on how you want to implement it. If you do
Plain Text
print(response.source_nodes)
you can access the sources for the response, which should include the doc name. There is also the CitationQueryEngine if you want in-text citations: https://docs.llamaindex.ai/en/stable/examples/query_engine/citation_query_engine.html

You can also add it to your prompt if you want the answer to reference a specific part of your sources.
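A minimal sketch of pulling doc names and text out of `response.source_nodes`, assuming LlamaIndex's usual `NodeWithScore` shape (a `.score` plus a `.node` carrying `.metadata` and `get_content()`); the helper name is hypothetical:

```python
def format_sources(source_nodes, snippet_len=200):
    """Collect (file_name, text snippet, score) for each retrieved node.

    Assumes each item looks like LlamaIndex's NodeWithScore: a .score,
    and a .node with .metadata (SimpleDirectoryReader records "file_name"
    there) and a get_content() method.
    """
    rows = []
    for sn in source_nodes:
        meta = sn.node.metadata
        rows.append((
            meta.get("file_name", "unknown"),
            sn.node.get_content()[:snippet_len],
            sn.score,
        ))
    return rows

# Hypothetical usage in chat.py, after the awaited line quoted above:
# response = await chat_engine.astream_chat(lastMessage.content, messages)
# for name, snippet, score in format_sources(response.source_nodes):
#     print(f"{name} ({score:.2f}): {snippet!r}")
```

You can then ship those tuples back to the frontend alongside the streamed answer instead of hoping the LLM reproduces its sources in prose.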
I saw some of this in the Python objects. Is this the kind of information you would store in a vector DB like Qdrant?

Hadn't seen the CitationQueryEngine, that looks neat!
Does the chunking process expose the row/col or offset into a document?
To answer my own question, at least initially, there is a good start here: https://docs.llamaindex.ai/en/stable/module_guides/loading/documents_and_nodes/root.html
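On the offsets question: LlamaIndex text nodes carry `start_char_idx` / `end_char_idx` (character offsets into the source document, when the node parser records them), not row/col. A small sketch of slicing the original text back out (the helper is hypothetical; the attribute names are LlamaIndex's):

```python
def node_span(doc_text, node):
    """Return (start, end, excerpt) for a node's position in its source text.

    Relies on LlamaIndex's start_char_idx / end_char_idx node attributes;
    these can be None if the splitter didn't record character offsets.
    """
    start, end = node.start_char_idx, node.end_char_idx
    if start is None or end is None:
        return None
    return start, end, doc_text[start:end]
```

Row/column positions aren't tracked, but you can derive a line number from the character offset with `doc_text.count("\n", 0, start) + 1` if you need one.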
Any docs on how I could do index_store.as_query_engine()? I'm trying to find something in the source code but am struggling to find it.