Hi, how to deal with the token limit error?

I have a large dataset, for example Amazon sales data. If I use text-to-SQL, it will show a token limit error, right? How can I resolve this issue?


My aim is to create a chatbot for Amazon sales data, to query and visualise on top of it. I'm planning to use the OpenAI API along with the LlamaIndex text-to-SQL function.

Any help is appreciated.
1 comment
When your SQL query returns a large amount of data, yes, it will hit a token limit error.

This is mostly tech debt; the library could be updated to handle this.

One workaround for now: you could try/catch the error, then run the query again with a text-to-SQL query engine that has synthesize_response=False, so that only the raw SQL results are returned (skipping the LLM synthesis pass over the rows, which is where the large result set blows the context window).
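A minimal sketch of that try/except fallback. The engine and exception names here are stand-ins, not real LlamaIndex classes; in practice the two engines would be NLSQLTableQueryEngine instances over the same database, one built with synthesize_response=True and one with synthesize_response=False, and you would catch the specific context-length error your LLM client raises.

```python
class TokenLimitError(Exception):
    """Stand-in for the context-length error raised by the LLM API."""


def query_with_fallback(question, synth_engine, raw_engine):
    """Try the summarising engine first; on a token-limit error,
    re-run the same question against the raw-results engine."""
    try:
        return synth_engine(question)   # LLM summarises the SQL rows
    except TokenLimitError:
        return raw_engine(question)     # raw rows only, no LLM pass


# --- tiny demo with stub engines ---
def synth_engine(question):
    # Simulates the failure mode: result set too large to summarise.
    raise TokenLimitError("result set exceeds the model's context window")


def raw_engine(question):
    # Simulates synthesize_response=False: raw SQL rows come back as-is.
    return [("2023-01", 41000), ("2023-02", 52500)]


print(query_with_fallback("monthly sales totals?", synth_engine, raw_engine))
# → [('2023-01', 41000), ('2023-02', 52500)]
```

The raw rows are less readable than a synthesised answer, but they always fit the pattern above; your chatbot can then paginate or chart them instead of asking the LLM to restate them.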