Updated 5 months ago

Does LlamaIndex support batch requests so I can have multiple queries in one request?

At a glance

A community member asked whether LlamaIndex supports batch requests, so that multiple queries can be processed in a single request. Another community member confirmed that it does and gave an example using the get_text_embedding_batch method to process multiple texts at once. The response is a batch of text embeddings, one per input text; the LlamaIndex documentation has further details.

2 comments
Please stand by @andy as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Yes, LlamaIndex does support batch requests. You can use the get_text_embedding_batch method to process multiple queries in one request. Here is an example of how you can do it:

Python
from llama_index.embeddings.openai import OpenAIEmbedding  # assumes the OpenAI embedding integration is installed

model = OpenAIEmbedding()
texts = ["hello world", "goodbye world"]
response_batch = model.get_text_embedding_batch(texts)
print(response_batch)

This code will process the texts "hello world" and "goodbye world" in a single batch request. The response will be a batch of text embeddings corresponding to the input texts.
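To make the shape of that batched response concrete without any API access, here is a minimal self-contained sketch. The StubEmbedding class and its fixed 4-dimensional dummy vectors are illustrative assumptions, not LlamaIndex code; only the get_text_embedding_batch method name mirrors the interface used above.

```python
class StubEmbedding:
    """Stand-in for a LlamaIndex embedding model (illustration only)."""
    dim = 4  # real models return much larger vectors

    def get_text_embedding(self, text: str) -> list[float]:
        # Deterministic dummy vector derived from the text length.
        return [float(len(text))] * self.dim

    def get_text_embedding_batch(self, texts: list[str]) -> list[list[float]]:
        # One embedding per input text, returned in the same order.
        return [self.get_text_embedding(t) for t in texts]

model = StubEmbedding()
texts = ["hello world", "goodbye world"]
response_batch = model.get_text_embedding_batch(texts)

print(len(response_batch))     # one embedding per input text
print(len(response_batch[0]))  # each embedding is a vector of floats
```

The key point is the one-to-one correspondence: response_batch[i] is the embedding of texts[i], so you can zip the two lists back together after the single request.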

For more details, you can refer to the LlamaIndex documentation.
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------