
Updated 3 months ago


Hi all, I have a question: I built a graph index on top of two indexes, and I am using LangChain to build a chatbot with memory (via create_llama_chat_agent()).

Things are working (though there is still room for optimization), but how can I get the metadata from the agent's response? I would like the same metadata I obtain when I run graph.query() directly on the underlying graph llama index. Thanks in advance.
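For context, the gap being described can be sketched in pure Python. None of the classes below are real llama_index or LangChain APIs; they are hypothetical stand-ins showing how a direct query returns a rich response object while an agent wrapper flattens it to a plain string:

```python
class SourceNode:
    """Stand-in for a source node carrying metadata (hypothetical)."""
    def __init__(self, doc_id, text):
        self.doc_id = doc_id
        self.text = text

class Response:
    """Stand-in for the object a direct graph query returns."""
    def __init__(self, text, source_nodes):
        self.response = text          # the answer text
        self.source_nodes = source_nodes  # the metadata the asker wants

def graph_query(question):
    # Direct query: answer plus source metadata.
    return Response("42", [SourceNode("doc-1", "supporting passage")])

def agent_run(question):
    # Agent wrapper: the tool output is flattened to a plain string,
    # so the source metadata is dropped before it reaches the caller.
    return graph_query(question).response

direct = graph_query("q")     # direct.source_nodes carries metadata
via_agent = agent_run("q")    # via_agent is just the answer string
```

The comments below discuss ways to get that dropped metadata back out.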
6 comments
Yea, the current wrappers for creating tools/agents in llama index aren't flexible enough yet to expose the source metadata. The approach I described in that issue is probably the easiest way for now
It would be a little tricky to add to the current agent wrappers, not sure what the best way would be πŸ€”
I believe we should have something like load_qa_with_sources_chain, where we can pull the sources from the llama index
the data is there, but it is not accessible to the agent. It could instead be surfaced through the prompt used for generation (i.e. instruct the model to "always add the SOURCES")
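The prompt-side idea can be sketched as a standing instruction baked into the template, so the model itself appends a SOURCES line to its answer. The template below is a hypothetical example, not a real llama_index or LangChain prompt, and note the caveat that sources cited this way are model-generated text rather than guaranteed-accurate metadata:

```python
# Hypothetical QA template with a standing citation instruction.
SYSTEM_TEMPLATE = (
    "Answer the question using only the context below.\n"
    "Always end your answer with a line starting with 'SOURCES:' "
    "listing the document ids you used.\n\n"
    "Context:\n{context}\n\n"
    "Question: {question}"
)

def build_prompt(context, question):
    """Fill the template for one query."""
    return SYSTEM_TEMPLATE.format(context=context, question=question)

prompt = build_prompt("[doc-1] LlamaIndex supports graph indexes.",
                      "Does LlamaIndex support graphs?")
```

The caller can then parse the trailing SOURCES line out of the generated answer, at the cost of trusting the model to cite faithfully.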
Yea that makes sense. Definitely open to PRs for this! I'm not sure how quickly I can get to this right now πŸ˜…πŸ« 