
Is there any way I can add retry logic

Is there any way I can add retry logic to an advanced text-to-SQL pipeline? My pipeline is defined below, which is straight from the llamaindex docs:

https://docs.llamaindex.ai/en/stable/examples/pipeline/query_pipeline_sql/?h=table+parser#define-expanded-query-pipeline
Plain Text
from llama_index.core.query_pipeline import (
    QueryPipeline as QP,
    Link,
    InputComponent,
    CustomQueryComponent,
)

qp = QP(
    modules={
        "input": InputComponent(),
        "table_retriever": obj_retriever,
        "table_output_parser": table_parser_component,
        "text2sql_prompt": text2sql_prompt,
        "text2sql_llm": llm,
        "sql_output_parser": sql_parser_component,
        "sql_retriever": sql_retriever,
        "response_synthesis_prompt": response_synthesis_prompt,
        "response_synthesis_llm": llm,
    },
    verbose=True,
)

The links and chains are also taken straight from the docs.

I want to implement retry logic such that if the SQL statement or its result is incorrect (incorrect syntax, or it doesn't address the prompt, for example), the pipeline should "fix" the SQL statement, re-check the statement and its results, and repeat until it is correct (maybe a max of 3 tries). The problem is I am trying to follow the documentation here: https://docs.llamaindex.ai/en/stable/examples/agent/agent_runner/query_pipeline_agent/#setup-simple-retry-agent-pipeline-for-text-to-sql

But I am unclear as to how exactly I can add this to my already defined pipeline (above). Any help is appreciated, thanks
8 comments
You could wrap the pipeline in a while loop with a try/except and retry?
Or write a custom module that retries the text2sql -> retrieval portion
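That outer loop could be sketched like this; it's self-contained and illustrative, where `run_pipeline` stands in for something like `qp.run()` and `validate` is a hypothetical check on the SQL/result (neither is a llamaindex API):

```python
# Sketch of an outer retry loop (illustrative; not llamaindex API).
# `run_pipeline` stands in for the pipeline call, and `validate` is a
# hypothetical check returning (ok, error_message).
def run_with_retries(run_pipeline, validate, query, max_tries=3):
    last_error = None
    for _ in range(max_tries):
        try:
            # Feed the previous error back in so the LLM can "fix" the SQL.
            response = run_pipeline(query, last_error)
        except Exception as exc:  # e.g. a SQL syntax error from the retriever
            last_error = str(exc)
            continue
        ok, last_error = validate(response)
        if ok:
            return response
    raise RuntimeError(f"Still failing after {max_tries} tries: {last_error}")
```

The key design point is that the previous error is passed back into the next attempt, so the prompt (or a retry prompt) can see what went wrong.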
Yes but I want to wrap the pipeline in an agent anyway so that I can chat back and forth with it
I will try this thanks
I think I will try and figure out how to wrap an agent around the pipeline, then implement the retry logic
Yea, if you just need memory, that's a valid option
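A minimal sketch of what that memory bookkeeping looks like, treating the pipeline as an opaque `run_turn` callable (the names and history format here are assumptions, not llamaindex APIs):

```python
# Minimal chat-with-memory step (illustrative; not llamaindex API).
# `run_turn` stands in for a call into the pipeline that formats the
# accumulated history into the prompt before running it.
def chat_step(history, user_msg, run_turn):
    history.append(("user", user_msg))
    reply = run_turn(history)
    history.append(("assistant", reply))
    return reply
```

The agent wrappers in the linked docs do essentially this bookkeeping for you, plus routing between the retry and normal paths.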
Actually, in the meantime I've almost implemented the agent with my pipeline:

Plain Text
qp = QueryPipeline(
    modules={
        "agent_input": agent_input_component, #new
        "retry_prompt": retry_prompt, #new
        "agent_llm": llm,
        "table_retriever": obj_retriever,
        "table_output_parser": table_parser_component,
        "text2sql_prompt": text2sql_prompt,
        "text2sql_llm": llm,
        "sql_output_parser": sql_parser_component,
        "sql_retriever": sql_retriever,
        "text_extractor": TextExtractor(),
        "response_synthesis_prompt": response_synthesis_prompt,
        "response_synthesis_llm": llm,
        "output_component": output_component, #new
    },
    verbose=True, # Prints intermediate steps of the pipeline.
)

I am just confused about the purpose of the retry prompt, as defined here: https://docs.llamaindex.ai/en/stable/examples/agent/agent_runner/query_pipeline_agent/#define-core-modules

Does the retry prompt tell the LLM to generate another natural-language query, or an SQL query? It seems like it does both, depending on whether it has chat history or not?
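As I read the linked example, the retry prompt asks the LLM for a new *natural-language* query (with the failed attempt visible in the chat history), and the existing text2sql modules then translate that revised query into SQL as usual. A paraphrased sketch of such a prompt; the wording below is illustrative, not the docs' exact template:

```python
# Illustrative paraphrase of a text-to-SQL retry prompt (not the docs' exact
# wording): it asks for a revised natural-language query, not raw SQL; the
# existing text2sql_prompt / text2sql_llm modules then translate it to SQL.
retry_prompt_str = (
    "A previous attempt at answering the user produced SQL that failed "
    "(see the chat history for the generated query and the error).\n"
    "Given the user input below, write a new, modified natural-language "
    "query that a downstream text-to-SQL step can translate correctly.\n"
    "Input: {input}\n"
    "New natural-language query: "
)

filled = retry_prompt_str.format(input="total sales per region")
```

So the retry prompt's output feeds back into the front of the text2sql chain rather than replacing the SQL generation step.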