I'm having a problem with NLSQLTableQueryEngine:

I'm getting an error when querying: sqlite3.Warning: You can only execute one statement at a time.
8 comments
Seems like the LLM wrote an incorrect SQL statement?
I ran into the same thing - didn't get a chance to play with it much - but turning on logging shows that the generated SQL statement includes an answer:

INFO:llama_index.indices.struct_store.sql_query:> Table desc str: Table 'city_stats' has columns: city_name (VARCHAR(16)), population (INTEGER), country (VARCHAR(16)) and foreign keys: .
DEBUG:llama_index.indices.struct_store.sql_query:> Predicted SQL query: SELECT city_name, population
FROM city_stats
ORDER BY population DESC
LIMIT 1;
SQLResult:
city_name | population
Tokyo | 37000000
Answer: Tokyo has the highest population.

So we need to change the prompt or something so the response doesn't include anything except the SQL.

(caveat - this was 5 minutes of debugging)
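
For anyone who wants to reproduce this, the log output above is just Python's standard logging module (the INFO:/DEBUG: prefixes are the default logging format, and the logger name is llama_index.indices.struct_store.sql_query), so a minimal sketch for turning it on is:

import logging
import sys

# Raise the log level to DEBUG so the "Predicted SQL query" line emitted by
# llama_index.indices.struct_store.sql_query is printed to stdout.
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)

Run this before creating the query engine and you'll see the full generated SQL, including any trailing SQLResult/Answer text.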
Interesting 🤔
I got the example from the docs to work by making this change - but the better answer might be to change the prompt instead of cleaning the response:

diff --git a/llama_index/indices/struct_store/sql_query.py b/llama_index/indices/struct_store/sql_query.py
index 5dd5b6cf..27254eed 100644
--- a/llama_index/indices/struct_store/sql_query.py
+++ b/llama_index/indices/struct_store/sql_query.py
@@ -237,7 +237,7 @@ class BaseSQLTableQueryEngine(BaseQueryEngine):
 
     def _parse_response_to_sql(self, response: str) -> str:
         """Parse response to SQL."""
-        result_response = response.strip()
+        result_response = response.strip().split(";")[0] + ";"
         return result_response
 
     @abstractmethod
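
Stated as a standalone helper (hypothetical name, not part of llama_index), the cleanup is just:

def keep_first_statement(response: str) -> str:
    """Keep only the first SQL statement in an LLM response.

    Everything after the first ';' (e.g. a hallucinated SQLResult:/Answer:
    block like the one in the log above) is dropped, so sqlite3 only ever
    sees a single statement.
    """
    return response.strip().split(";")[0] + ";"

raw = """SELECT city_name, population
FROM city_stats
ORDER BY population DESC
LIMIT 1;
SQLResult:
city_name | population
Tokyo | 37000000
Answer: Tokyo has the highest population."""

print(keep_first_statement(raw))
# SELECT city_name, population
# FROM city_stats
# ORDER BY population DESC
# LIMIT 1;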

(Also note that the SQLite example from the guide has an issue, at least with the library versions I'm using: I had to add a 'connection.commit()' after each execute when inserting rows, or else it issued a ROLLBACK and no data ended up in the table.)
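
If it saves someone else the same debugging, here is a rough sketch of that insert step (assumes SQLAlchemy 2.x-style connections, which don't autocommit, and reuses the city_stats schema from the log above; the row values are just example data):

from sqlalchemy import (
    Column, Integer, MetaData, String, Table, create_engine, insert,
)

engine = create_engine("sqlite:///:memory:")
metadata_obj = MetaData()

city_stats_table = Table(
    "city_stats",
    metadata_obj,
    Column("city_name", String(16), primary_key=True),
    Column("population", Integer),
    Column("country", String(16)),
)
metadata_obj.create_all(engine)

rows = [
    {"city_name": "Tokyo", "population": 37000000, "country": "Japan"},
    {"city_name": "Toronto", "population": 2930000, "country": "Canada"},
]
with engine.connect() as connection:
    for row in rows:
        connection.execute(insert(city_stats_table).values(**row))
    # Without an explicit commit, the transaction is rolled back when the
    # connection closes and the table stays empty.
    connection.commit()

(Using "with engine.begin() as connection:" instead would also work, since that commits automatically when the block exits.)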
Actually, we discovered a bit of a bug in the SQL stuff: we had a special stop-token setup, and the recent LLM refactor stopped it from working.
It looks like a PR to fix it JUST merged