
Updated 7 months ago

Does LlamaIndex support batching?

At a glance
Does LlamaIndex support batching requests? For example, asking 32 questions in one request rather than making 32 separate calls?

OpenAI has something like this:

https://platform.openai.com/docs/guides/rate-limits/error-mitigation

but I'm not sure about LlamaIndex.
1 comment
Not at the moment. Only async/concurrency.
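A minimal sketch of the async/concurrency approach the comment refers to, using `asyncio.gather` to issue many questions concurrently. The `ask` function here is a hypothetical stub standing in for a real query call (with LlamaIndex you would typically `await` the query engine's async query method instead):

```python
import asyncio

# Hypothetical stub for an async query call; replace the body with a
# real async call to your query engine / LLM client.
async def ask(question: str) -> str:
    await asyncio.sleep(0)  # simulate an async LLM request
    return f"answer to: {question}"

async def ask_all(questions: list[str]) -> list[str]:
    # Fire all questions concurrently rather than awaiting them one-by-one.
    return await asyncio.gather(*(ask(q) for q in questions))

answers = asyncio.run(ask_all([f"question {i}" for i in range(32)]))
print(len(answers))
```

This doesn't reduce the number of API requests the way OpenAI's batching does, but it avoids waiting on each call serially.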