The community member is trying to surface an intermediate step of a query pipeline as one of the outputs of the same query pipeline. Another community member responds that this functionality has just been added and that they will provide an example in about 30 minutes. A third community member suggests using .run_multi and adding a new node with no other connections, so the final answer is a JSON containing both the LLM output and the intermediate step. The community members also mention a relevant thread in the LlamaIndex documentation.
Hello. I'm trying to surface an intermediate step of a query pipeline as one of the outputs of the same query pipeline. Do you know if this is possible? If so, could you point me to an example? Not the streaming one, I didn't understand that one.
@Logan M Hello, is there a way? I found out that you can use .run_multi and add a new node that has no other connections. The final answer is then a JSON composed of the LLM output plus the value you needed from the intermediate step.
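A minimal sketch of that .run_multi approach, assuming the standard QueryPipeline API from llama_index.core; the module names (prompt, llm, passthrough), the prompt text, and the OpenAI model are illustrative placeholders, not from the original thread:

```python
from llama_index.core import PromptTemplate
from llama_index.core.query_pipeline import FnComponent, QueryPipeline
from llama_index.llms.openai import OpenAI

# Prompt -> LLM chain, plus a "passthrough" node that simply echoes the
# formatted prompt. Because it has no outgoing links, it becomes a second
# pipeline output alongside the LLM answer.
prompt_tmpl = PromptTemplate("Write a short answer about {topic}.")
llm = OpenAI(model="gpt-4o-mini")
passthrough = FnComponent(fn=lambda x: x)  # leaf node exposing the intermediate value

pipeline = QueryPipeline(verbose=True)
pipeline.add_modules({"prompt": prompt_tmpl, "llm": llm, "passthrough": passthrough})
pipeline.add_link("prompt", "llm")          # main path to the LLM
pipeline.add_link("prompt", "passthrough")  # side branch that surfaces the intermediate step

# run_multi takes per-module inputs and returns a dict keyed by the leaf
# modules, so the final LLM response and the intermediate value come back together.
outputs = pipeline.run_multi({"prompt": {"topic": "query pipelines"}})
print(outputs["llm"])          # dict holding the final LLM response
print(outputs["passthrough"])  # dict holding the intermediate (formatted prompt)
```

The key idea is just what the message describes: any module with no outgoing links is treated as an output, so attaching a do-nothing node to the intermediate step makes run_multi return it next to the LLM's answer.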