Slow response

At a glance

The community member is using the sample code from the README, and it takes around 20-30 seconds for the API to answer a question about their data. They ask whether this can be made faster or tweaked in an easy way. The comments suggest the slow response time may be due to issues with the OpenAI endpoints, and that the expected answer time for a simple query should be around 5 seconds or less. Some community members also point out that if the data does not change, there is no need to re-index the same document every time; saving the index once and reusing it improves query time.

Hi guys, thanks for the awesome work. When I use the sample code from the README, it takes around 20-30 seconds for the API to give me an answer about my data. Can this be made faster/tweaked in an easy way?
8 comments
Which sample code are you using?

Sometimes OpenAI can be quite slow 🙃
Basically this code, with a single input text file of about 50 KB
Hmm yeah, I blame OpenAI lol

Looking at #⚠statuspage, I see some recent issues with their endpoints
Thanks! What would the expected answer time for a simple query be in this case?
If the data doesn't change, it isn't necessary to index the same document again and again. If you save the index once, you can just load it and query it.
I do that with my indexed documents and it works fine.
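
As the comment above suggests, persisting the index once and reloading it on later runs avoids re-embedding the data on every query. Below is a minimal sketch, assuming the LlamaIndex library (which this thread appears to concern) and a recent release; the module paths, the `./data` input folder, and the `./storage` persist directory are illustrative assumptions, and older versions exposed a different save/load API.

```python
import os

# Assumes a recent LlamaIndex release (llama_index.core); names may differ
# in older versions.
from llama_index.core import (
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)

PERSIST_DIR = "./storage"  # hypothetical location for the saved index

if not os.path.exists(PERSIST_DIR):
    # First run: build the index (the slow, embedding-heavy step)
    # and persist it to disk.
    documents = SimpleDirectoryReader("./data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    index.storage_context.persist(persist_dir=PERSIST_DIR)
else:
    # Later runs: load the saved index instead of re-embedding the data.
    storage_context = StorageContext.from_defaults(persist_dir=PERSIST_DIR)
    index = load_index_from_storage(storage_context)

query_engine = index.as_query_engine()
print(query_engine.query("What is this document about?"))
```

With the index cached on disk, repeat runs only pay for the final LLM call, which lines up with the rough 5-second figure mentioned below for a simple query.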
Probably about 5s or less in my experience 🤔