Updated 3 months ago

How to use 'gpt-4' instead of the default 'text-davinci-002' model when using query_engine = SQLTableRetrieverQueryEngine(
sql_database,
object_index.as_retriever(similarity_top_k=3),
)
15 comments
Please stand by @vinodvarma24 as I search the knowledge base. We are experiencing especially high traffic right now and might take longer than usual. :wave:
----------------------
To use the 'gpt-4' model instead of the default 'text-davinci-002' model with SQLTableRetrieverQueryEngine, you need to define the LLM (Language Model) with the 'gpt-4' model and pass it to the ServiceContext. Here is an example of how you can do it:

Plain Text
from llama_index import (
    SQLTableRetrieverQueryEngine,
    ServiceContext
)
from llama_index.llms import OpenAI

# define the LLM (replaces the default text-davinci-002)
llm = OpenAI(temperature=0, model="gpt-4")
service_context = ServiceContext.from_defaults(llm=llm)

# construct the SQLTableRetrieverQueryEngine
query_engine = SQLTableRetrieverQueryEngine(
    sql_database,
    object_index.as_retriever(similarity_top_k=3),
    service_context=service_context
)


In this code, OpenAI(temperature=0, model="gpt-4") defines the LLM with the 'gpt-4' model. The LLM is wrapped in a ServiceContext, which is then passed to SQLTableRetrieverQueryEngine via its service_context argument.

Please note that you need to replace sql_database and object_index with your actual SQL database and object index.

This information is based on the extract from the Custom LLMs guide in the LlamaIndex documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
you can also set a global service context instead of passing it in

Plain Text
from llama_index import set_global_service_context

set_global_service_context(service_context)
@kapa.ai What is the best way to customize the prompt for SQL Query Engine
Let me check the knowledge base for you. One moment! :books:
----------------------
To customize the prompt for the SQL Query Engine in LlamaIndex, you can pass a custom prompt to the text_to_sql_prompt and response_synthesis_prompt parameters when creating an instance of SQLTableRetrieverQueryEngine.

Here's an example of how you can do it:

Plain Text
from llama_index import Prompt, SQLTableRetrieverQueryEngine

# Prompt used to translate the natural-language question into SQL.
# Its template must expose the {dialect}, {schema}, and {query_str}
# variables (names match LlamaIndex's defaults at the time of writing).
text_to_sql_template = (
    "Given an input question, create a syntactically correct {dialect} "
    "query to run.\n"
    "Use only the tables listed in this schema:\n"
    "{schema}\n"
    "Question: {query_str}\n"
    "SQLQuery: "
)
text_to_sql_prompt = Prompt(text_to_sql_template)

# Prompt used to turn the SQL results into a natural-language answer.
# Its template receives {query_str}, {sql_query}, and {context_str}.
response_synthesis_template = (
    "Given the question and the SQL results below, synthesize a response.\n"
    "Question: {query_str}\n"
    "SQL: {sql_query}\n"
    "SQL Response: {context_str}\n"
    "Response: "
)
response_synthesis_prompt = Prompt(response_synthesis_template)

# Construct the SQLTableRetrieverQueryEngine with the custom prompts
query_engine = SQLTableRetrieverQueryEngine(
    sql_database,
    table_retriever,
    text_to_sql_prompt=text_to_sql_prompt,
    response_synthesis_prompt=response_synthesis_prompt
)


Here, Prompt(...) wraps a template string into a prompt object, which is supplied through the text_to_sql_prompt and response_synthesis_prompt parameters of SQLTableRetrieverQueryEngine.

Please replace sql_database and table_retriever with your actual SQL database and table retriever.

This information is based on the extracts from the Custom Prompts guide and the SQL Query Engine reference in the LlamaIndex documentation.
----------------------
@kapa.ai What is the difference between text_to_sql_prompt and response_synthesis_prompt here ?
Let me check the knowledge base for you. One moment! :books:
----------------------
The text_to_sql_prompt and response_synthesis_prompt are both parameters of the SQL Query Engine in LlamaIndex, but they serve different purposes:

  • text_to_sql_prompt: This is a prompt used to convert a natural language query into a SQL query. It's used when the SQL Query Engine needs to translate the user's question (expressed in natural language) into a SQL query that can be executed against the database.
  • response_synthesis_prompt: This is a prompt used to synthesize the final response from the results of the SQL query. After the SQL query has been executed and the results have been retrieved, this prompt is used to format those results into a natural language response that can be returned to the user.
In both cases, the prompts are used to guide the language model in performing a specific task. By customizing these prompts, you can control how the language model translates questions into SQL queries and how it formats the results of those queries into responses.
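To make the two stages concrete, here is a plain-Python sketch (no llama_index import; the template texts and variable names follow LlamaIndex's documented defaults but are illustrative, not the library's exact templates) of which variables each prompt is filled with:

```python
# Stage 1: the text-to-sql prompt is filled before the LLM writes the query.
# Variable names ({dialect}, {schema}, {query_str}) follow LlamaIndex's
# defaults at the time of writing.
text_to_sql_template = (
    "Given the {dialect} schema below, write a query answering the question.\n"
    "Schema:\n{schema}\n"
    "Question: {query_str}\n"
    "SQLQuery: "
)
sql_stage = text_to_sql_template.format(
    dialect="sqlite",
    schema="CREATE TABLE city (name TEXT, population INT)",
    query_str="Which city has the largest population?",
)

# Stage 2: after the SQL runs, the response-synthesis prompt is filled with
# the original question, the generated SQL, and the query results.
response_synthesis_template = (
    "Question: {query_str}\n"
    "SQL: {sql_query}\n"
    "SQL Response: {context_str}\n"
    "Response: "
)
answer_stage = response_synthesis_template.format(
    query_str="Which city has the largest population?",
    sql_query="SELECT name FROM city ORDER BY population DESC LIMIT 1",
    context_str="[('Tokyo',)]",
)
```

Customizing text_to_sql_prompt changes how stage 1 writes SQL; customizing response_synthesis_prompt changes how stage 2 phrases the final answer.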

This information is based on the extracts from the SQL Query Engine reference () in the LlamaIndex documentation.
----------------------
@Logan M Just wanted to let you know that the reference links kapa.ai is giving are not working; maybe the links have changed.
Yea kapa takes a while to update when the docs change. Feel free to use the search on the docs though, it works really well now
@Logan M Where do I find the response_synthesis_prompt mentioned by kapa.ai to customize my output from the SQL Query Engine?
Prompts are not organized the best, sorry about that haha