Hi all, I'm relatively new to LlamaIndex. I was wondering if anyone could advise me on how to go about showing/logging the chunks retrieved to answer the intermediate questions when using SubQuestionQueryEngine.
I asked kapa, which advised me to modify the source code, but I assume there is already a way to see which chunks were retrieved that I can either use or at least take inspiration from.
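For context, this is roughly what I'm imagining, pieced together from skimming the callback docs. I'm not at all sure these are the right hooks or import paths for my version, so please correct me if there's a cleaner way:

```python
from llama_index.core import Settings
from llama_index.core.callbacks import (
    CallbackManager,
    CBEventType,
    EventPayload,
    LlamaDebugHandler,
)

# Attach a debug handler globally so sub-question events get recorded
llama_debug = LlamaDebugHandler(print_trace_on_end=True)
Settings.callback_manager = CallbackManager([llama_debug])

# ... build the SubQuestionQueryEngine as usual and run a query ...

# Afterwards, try to pull out each sub-question, its answer, and the chunks used
for start_event, end_event in llama_debug.get_event_pairs(CBEventType.SUB_QUESTION):
    qa_pair = end_event.payload[EventPayload.SUB_QUESTION]
    print("Sub question:", qa_pair.sub_q.sub_question)
    print("Answer:", qa_pair.answer)
    # I'm guessing qa_pair.sources holds the retrieved nodes for that sub-question
    for node_with_score in qa_pair.sources:
        print(node_with_score.score, node_with_score.node.get_content()[:200])
```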
I'd appreciate it if anyone could advise or point me towards any useful documentation.