Hi all, I have a question: I built a graph index on top of two indexes, and I'm using LangChain to build a chatbot with memory (via create_llama_chat_agent()).
Things are working (though there's still room for optimization), but how can I get the metadata from the agent's response? I'd like the same metadata I get when I run graph.query() directly on the underlying graph index. Thanks in advance.
Yea, the current wrappers for creating tools/agents in LlamaIndex aren't flexible enough yet to surface the source metadata. The approach I described in that issue is probably the easiest way for now.
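For anyone landing here later, a minimal sketch of that kind of workaround: wrap the query call in your own tool function that stashes the full response object (which carries the metadata) in a side channel, and hand the agent only the answer text. A stub stands in for graph.query() here, and the names StubResponse / last_response / query_tool are all illustrative, not part of either library's API.

```python
class StubResponse:
    """Stand-in for the response object a graph.query() call would return."""
    def __init__(self, text, extra_info):
        self.response = text          # the answer text
        self.extra_info = extra_info  # metadata dict (illustrative)


def graph_query_stub(query):
    # In real code this would be your actual graph.query(query) call.
    return StubResponse(f"answer to: {query}", {"source": "index_a"})


# Side channel: holds the most recent full response, metadata included.
last_response = {}


def query_tool(query: str) -> str:
    """Tool function for the agent: capture the full response, return text only."""
    resp = graph_query_stub(query)
    last_response["value"] = resp
    return resp.response


# The agent sees plain text; you read the metadata from the side channel after.
answer = query_tool("what is X?")
metadata = last_response["value"].extra_info
```

You'd register query_tool as the function behind your LangChain tool instead of letting the wrapper build one for you; after each agent turn, last_response still has the metadata the wrapper would otherwise drop.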