Updated 4 months ago

Does LlamaIndex support batching?

Does LlamaIndex support batching requests? For example, asking 32 questions in one request rather than making 32 separate calls?

OpenAI has something like this:

https://platform.openai.com/docs/guides/rate-limits/error-mitigation

but I'm not sure about LlamaIndex.
1 comment
Not at the moment. Only async/concurrency.
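
A minimal sketch of the async/concurrency approach, assuming a standard `VectorStoreIndex` query engine and its async `aquery` method (exact import paths vary by LlamaIndex version; the `./data` directory and question list are placeholders):

```python
import asyncio

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Build an index and query engine over local documents (placeholder path).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

async def answer_all(questions):
    # Launch all queries concurrently instead of awaiting them one at a time.
    tasks = [query_engine.aquery(q) for q in questions]
    return await asyncio.gather(*tasks)

questions = [f"Question {i}" for i in range(32)]
responses = asyncio.run(answer_all(questions))
for question, response in zip(questions, responses):
    print(question, "->", response)
```

This still issues 32 individual LLM calls under the hood, so provider rate limits apply; it just overlaps them so you are not waiting on each one sequentially.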