
Updated 2 months ago

does llama index support batch requests so i can have multiple queries in one request

2 comments
Please stand by @andy as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Yes, LlamaIndex supports batch requests for embeddings. You can use the get_text_embedding_batch method on an embedding model to process multiple texts in one call. Here is an example of how you can do it:

Python
# Import path assumes LlamaIndex >= 0.10 with the OpenAI embeddings package installed.
from llama_index.embeddings.openai import OpenAIEmbedding

model = OpenAIEmbedding()
texts = ["hello world", "goodbye world"]
response_batch = model.get_text_embedding_batch(texts)
print(response_batch)

This code processes the texts "hello world" and "goodbye world" in a single batch request. The response is a list of embedding vectors, one per input text, in the same order.
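For illustration, here is a minimal, self-contained sketch of that batching pattern. The `ToyEmbedder` class below is hypothetical and not part of LlamaIndex; it only mimics the one-call-many-texts shape that LlamaIndex embedding models expose through get_text_embedding_batch:

```python
# Hypothetical toy embedder illustrating the batch pattern; NOT LlamaIndex code.
class ToyEmbedder:
    def get_text_embedding(self, text: str) -> list[float]:
        # Fake "embedding": two character-count features, purely for shape.
        return [float(len(text)), float(text.count(" "))]

    def get_text_embedding_batch(self, texts: list[str]) -> list[list[float]]:
        # One call, many texts: returns one vector per input, in order.
        return [self.get_text_embedding(t) for t in texts]

model = ToyEmbedder()
texts = ["hello world", "goodbye world"]
response_batch = model.get_text_embedding_batch(texts)
print(len(response_batch))  # one embedding per input text
```

The key property to rely on is that the output list is index-aligned with the input list, so `response_batch[i]` is the embedding of `texts[i]`.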

For more details, you can refer to the LlamaIndex documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------