prathamesh4585
Joined September 25, 2024
Hello everybody, I have one doubt.
I don't want SQLTableRetrieverQueryEngine to return the schema of database objects, and I don't want users to be able to drop or delete tables. I also want to stop users from accessing information about objects that are not in the include_tables param. I have tried overriding the run_sql function in the SQLDatabase class, but it does not seem to work. Is there any other way? Am I doing it wrong? Please help.

My code to override the function:
from llama_index import SQLDatabase
from typing import Dict, Tuple

class MySQLDatabase(SQLDatabase):  # Assuming SQLDatabase is the original class
    def run_sql(self, command: str) -> Tuple[str, Dict]:
        """Override the run_sql method to add custom functionality."""
        # Check for DDL, DML, and schema-related commands
        ddl_commands = {'CREATE', 'ALTER', 'DROP', 'TRUNCATE', 'COMMENT'}
        dml_commands = {'INSERT', 'UPDATE', 'DELETE', 'MERGE'}
        schema_commands_mysql = {'information_schema.', 'sqlite_master'}  # Adjust for MySQL

        if any(cmd in command.upper() for cmd in ddl_commands.union(dml_commands)):
            raise ValueError("I appreciate your input! Let's keep our conversation safe and avoid any database modifications.")

        if any(cmd in command.lower() for cmd in schema_commands_mysql):
            raise ValueError("I appreciate your input! However, retrieving schema information is not allowed for security reasons.")

        return super().run_sql(command)
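
For reference, a minimal sketch of how a subclass like this could be wired into SQLTableRetrieverQueryEngine, assuming the newer llama_index.core import paths, an existing SQLAlchemy engine named engine, and hypothetical table names:

# Sketch only: wiring the guarded SQLDatabase subclass into a retriever query engine.
# Assumes llama_index.core import paths and an existing SQLAlchemy `engine`.
from llama_index.core import VectorStoreIndex
from llama_index.core.indices.struct_store.sql_query import SQLTableRetrieverQueryEngine
from llama_index.core.objects import ObjectIndex, SQLTableNodeMapping, SQLTableSchema

included = ["orders", "customers"]  # hypothetical table names

# include_tables hides the other tables from the engine's view;
# the run_sql override adds a second guard at execution time.
sql_database = MySQLDatabase(engine, include_tables=included)

table_node_mapping = SQLTableNodeMapping(sql_database)
table_schemas = [SQLTableSchema(table_name=name) for name in included]
obj_index = ObjectIndex.from_objects(table_schemas, table_node_mapping, VectorStoreIndex)

query_engine = SQLTableRetrieverQueryEngine(
    sql_database,
    obj_index.as_retriever(similarity_top_k=1),
)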
7 comments
Hello @Logan M and everybody,
I want my llama-index RAG chatbot to remember past questions for context. Can I follow the format OpenAI has for chat completions, where the messages would go to the llama-index bot instead?

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
        {"role": "user", "content": "Where was it played?"},
    ],
)

Also, how do I handle it when the messages get close to the context length for a given OpenAI model? Is there a better way to make the model remember past questions?
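
As a rough sketch (assuming an existing index and llama_index.core import paths), a chat engine with a token-limited memory buffer covers both points:

# Sketch only: a llama-index chat engine that keeps past turns in a token-limited buffer.
# Assumes an existing `index` (e.g. a VectorStoreIndex) and llama_index.core import paths.
from llama_index.core.memory import ChatMemoryBuffer

# token_limit trims the oldest turns once the history gets close to the model's context window
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

chat_engine = index.as_chat_engine(
    chat_mode="condense_plus_context",
    memory=memory,
    system_prompt="You are a helpful assistant.",
)

response = chat_engine.chat("Who won the World Series in 2020?")
followup = chat_engine.chat("Where was it played?")  # memory supplies the earlier turn as context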
4 comments
Hello @kapa.ai, what can I do if I want deterministic output from my engine? It should only return JSON in a structure I specify. How can I achieve this?
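
One possible sketch, assuming an existing index, an OpenAI LLM, and llama_index.core import paths: lower the temperature and attach a Pydantic output class so the engine returns the specified structure (the class Answer below is a hypothetical example).

# Sketch only: constraining a query engine's output to a JSON structure via a Pydantic class.
# Assumes an existing `index`; `Answer` is a hypothetical target structure.
from pydantic import BaseModel
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI


class Answer(BaseModel):
    """Hypothetical target structure for the response."""
    title: str
    summary: str


# temperature=0 makes the output as repeatable as the model allows (not strictly deterministic)
Settings.llm = OpenAI(model="gpt-3.5-turbo", temperature=0)

query_engine = index.as_query_engine(output_cls=Answer, response_mode="compact")
response = query_engine.query("Summarize the document")
print(response.response)  # should hold the parsed Answer instance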
3 comments
@kapa.ai can I use a multi-step query engine with the SQL engine?
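
A minimal sketch of that combination, assuming an existing sql_database and llama_index.core import paths:

# Sketch only: wrapping a text-to-SQL engine in a multi-step query engine.
# Assumes an existing `sql_database` (SQLDatabase) and llama_index.core import paths.
from llama_index.core.indices.query.query_transform.base import StepDecomposeQueryTransform
from llama_index.core.query_engine import MultiStepQueryEngine, NLSQLTableQueryEngine

sql_query_engine = NLSQLTableQueryEngine(sql_database=sql_database)  # or SQLTableRetrieverQueryEngine

step_decompose = StepDecomposeQueryTransform(verbose=True)

multi_step_engine = MultiStepQueryEngine(
    query_engine=sql_query_engine,
    query_transform=step_decompose,
    index_summary="Answers questions by querying the SQL database",
)

response = multi_step_engine.query("Compare revenue between 2022 and 2023")  # hypothetical question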
8 comments
Hello guys,
I am working on a project where the user asks natural language queries and this llama-index-based engine converts the natural language to a SQL query, executes it on my database, and returns the answer to the user in natural language. The problem is that it can only execute one query per question, so comparison questions cannot be answered, and if a question does not require querying the database it will still query the database. How can I solve this? Please help me with your suggestions.
Thanks in advance.
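
One possible direction, sketched under assumed names and llama_index.core import paths: a SubQuestionQueryEngine breaks a comparison question into several SQL queries, and a RouterQueryEngine picks which tool to use, so a non-SQL tool can be added for questions that don't need the database.

# Sketch only: sub-question decomposition for comparisons plus routing between tools.
# Assumes an existing `sql_database`; table contents and questions are hypothetical.
from llama_index.core.query_engine import (
    NLSQLTableQueryEngine,
    RouterQueryEngine,
    SubQuestionQueryEngine,
)
from llama_index.core.selectors import LLMSingleSelector
from llama_index.core.tools import QueryEngineTool

sql_engine = NLSQLTableQueryEngine(sql_database=sql_database)

sql_tool = QueryEngineTool.from_defaults(
    query_engine=sql_engine,
    name="sql_tool",
    description="Answers single questions that require querying the database",
)

# Decomposes a comparison question into multiple sub-questions, each answered by the SQL tool
comparison_engine = SubQuestionQueryEngine.from_defaults(query_engine_tools=[sql_tool])
comparison_tool = QueryEngineTool.from_defaults(
    query_engine=comparison_engine,
    name="comparison_tool",
    description="Answers comparison questions that need several database queries",
)

# A third, non-SQL tool (e.g. index.as_query_engine() over documents) can be added here
# so that questions which don't need the database skip it entirely.
router = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=[sql_tool, comparison_tool],
)

response = router.query("Compare total sales for 2022 and 2023")  # hypothetical question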
15 comments