
Timeout

Hi, I am building RAG with various LLMs, some using OpenAI and some using llama_index.llms.azure_inference, and I just realized that the default LlamaIndex LLM class doesn't come with a timeout option like the one built into the OpenAI client. I wonder if there is an easy way to implement a timeout, or did I miss something?
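For reference, one generic way to enforce a timeout around any LlamaIndex LLM, regardless of whether its underlying client exposes one, is to wrap the async completion call with `asyncio.wait_for`. The sketch below is a minimal example, assuming the LLM object exposes `acomplete` (as the LlamaIndex `LLM` base class does); the `AzureAICompletionsModel` class name in the usage comment and the 30-second default are illustrative assumptions.

```python
import asyncio

from llama_index.core.llms import LLM  # base class shared by the OpenAI and Azure Inference wrappers


async def complete_with_timeout(llm: LLM, prompt: str, timeout_s: float = 30.0):
    """Call llm.acomplete(prompt), raising asyncio.TimeoutError after timeout_s seconds."""
    return await asyncio.wait_for(llm.acomplete(prompt), timeout=timeout_s)


# Example usage (class name and constructor arguments are placeholders):
# from llama_index.llms.azure_inference import AzureAICompletionsModel
# llm = AzureAICompletionsModel(endpoint="...", credential="...", model_name="...")
# response = asyncio.run(complete_with_timeout(llm, "Hello", timeout_s=10.0))
```

For the OpenAI-backed wrappers, the constructor also accepts a `timeout` argument that is forwarded to the underlying OpenAI client, so passing it there may be enough; the `asyncio.wait_for` approach is mainly useful for LLM classes that don't expose such a parameter.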