Does query engine come with retry

Does the query engine come with a retry mechanism? I am getting a Request Timeout when calling Azure OpenAI.
8 comments
The openai client has retries built in, but you might just need to increase the timeout.
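If you want to tune both explicitly, here is a rough sketch, assuming you build the LLM through llama_index's AzureOpenAI class, whose timeout and max_retries settings are passed down to the underlying openai client. The deployment, endpoint, key, and API version below are placeholders.

Plain Text
from llama_index.llms.azure_openai import AzureOpenAI

llm = AzureOpenAI(
    engine="my-deployment",                                       # placeholder deployment name
    azure_endpoint="https://<your-resource>.openai.azure.com/",   # placeholder endpoint
    api_key="...",
    api_version="...",
    timeout=120.0,     # seconds before a request is considered timed out
    max_retries=5,     # retries performed inside the openai client on transient errors
)

Then set that llm wherever you configure the LLM (for example Settings.llm) so the query engine's synthesizer actually uses it.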
Is there a way to log whether openai did a retry or not?
I get this trace. I'm guessing it's not even calling the LLM.

Plain Text
  | **********
  | Trace: query
  |     |_query ->  367.368937 seconds
  |       |_retrieve ->  0.94417 seconds
  |         |_embedding ->  0.191088 seconds
  |       |_synthesize ->  366.424662 seconds
  |         |_templating ->  6e-06 seconds
  |         |_llm ->  0.0 seconds
  | **********
Sorry, I mean the synthesize part is what's lagging.
I think if you set the logging to debug, you can see what openai is doing.
Plain Text
import logging
import sys

# Send DEBUG-level output (including the openai client's request logs) to stdout
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))
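If root-level DEBUG is too noisy, a narrower variant is to raise only the loggers you care about. This is a minimal sketch, assuming the openai python client logs under the "openai" logger name and its HTTP layer under "httpx"/"httpcore"; adjust the names if your versions differ.

Plain Text
import logging
import sys

handler = logging.StreamHandler(stream=sys.stdout)
logging.basicConfig(level=logging.WARNING, handlers=[handler])

# Assumed logger names for the openai client and its HTTP transport
for name in ("openai", "httpx", "httpcore"):
    logging.getLogger(name).setLevel(logging.DEBUG)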
Oh nice, thanks. I'm guessing that's more detailed than LlamaIndex's set_global_handler("simple").
Yeah, that will show the debug logs from OpenAI's client.
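For comparison, assuming "set handler simple" refers to LlamaIndex's global handler, it looks like this; it prints each LLM prompt and response, which is coarser than the python logging setup above since it doesn't surface the openai client's HTTP or retry activity.

Plain Text
from llama_index.core import set_global_handler

# Prints LLM prompts/responses as they happen, without HTTP-level detail
set_global_handler("simple")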