At a glance

The community member is trying to expose an intermediate step of a query pipeline as one of the outputs of that same query pipeline. Another community member responds that this functionality has just been added and offers to link an example in about 30 minutes, once it is released. A third community member suggests using .run_multi and adding a new node with no outgoing connection, so that the final answer is a JSON containing both the LLM output and the intermediate step. The community members also mention relevant resources in the LlamaIndex documentation.

Hello. I'm trying to output an intermediate step of a query pipeline as one of the outputs of the same query pipeline. Do you know if this is possible? If so, could you refer me to an example? Not the streaming one, I didn't understand that one 😦
Actually we JUST added this, give me like 30 mins and I can link to the example once it is released
@Logan M Hello.. is there a way? I found out that you can use .run_multi and add a new node that has no other connection. The final answer is then a JSON containing the LLM output plus the value you needed from the intermediate step.
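A minimal sketch of that .run_multi approach, assuming a llama-index version that still ships QueryPipeline; the module names ("prompt", "llm", "expose_prompt"), the OpenAI LLM, and the FnComponent passthrough are illustrative choices, not from the thread:

```python
# Sketch of exposing an intermediate step via run_multi: add a passthrough
# node with no outgoing link so it becomes a leaf of the pipeline DAG.
from llama_index.core import PromptTemplate
from llama_index.core.query_pipeline import FnComponent, QueryPipeline
from llama_index.llms.openai import OpenAI  # requires llama-index-llms-openai


def passthrough(rendered_prompt: str) -> str:
    """Return the intermediate value unchanged so it surfaces as an output."""
    return rendered_prompt


prompt = PromptTemplate("Write one sentence about {topic}.")
llm = OpenAI(model="gpt-3.5-turbo")
expose_prompt = FnComponent(fn=passthrough)

pipeline = QueryPipeline()
pipeline.add_modules({"prompt": prompt, "llm": llm, "expose_prompt": expose_prompt})
pipeline.add_link("prompt", "llm")            # normal path: prompt -> LLM
pipeline.add_link("prompt", "expose_prompt")  # side branch with no outgoing link

# "expose_prompt" has no downstream module, so run_multi returns its value
# alongside the LLM answer, roughly {"llm": {...}, "expose_prompt": {...}}
# (the exact nesting may vary by version).
outputs = pipeline.run_multi({"prompt": {"topic": "query pipelines"}})
print(outputs)
```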