_hodophile_ · 4 weeks ago
Improving Inference Time for Llama Index with Parallel Query Engines
Hi,

I'm a newbie to LlamaIndex. I want to do RAG over multiple PDF documents (10 to 100 docs). Is there a way to create the query engines in parallel, and how can I improve inference time?

Thanks
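The parallel construction the question asks about can be sketched with a thread pool fanning out one engine build per document. This is a minimal stdlib sketch of the pattern only: `build_query_engine` here is a hypothetical stand-in for the real per-PDF work (in LlamaIndex that would be loading the file and calling something like `VectorStoreIndex.from_documents(...).as_query_engine()`), which is I/O-bound and so benefits from concurrent execution.

```python
from concurrent.futures import ThreadPoolExecutor

def build_query_engine(doc_path):
    # Hypothetical stand-in for building one query engine per PDF.
    # In a real LlamaIndex setup this would parse the PDF, embed it,
    # and return an actual query engine object.
    return {"source": doc_path, "ready": True}

def build_engines_parallel(doc_paths, max_workers=8):
    # Fan out engine construction across threads. PDF parsing and
    # embedding calls spend most of their time waiting on I/O, so
    # threads give real speedup despite the GIL.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves the input order of doc_paths.
        return list(pool.map(build_query_engine, doc_paths))

engines = build_engines_parallel([f"doc_{i}.pdf" for i in range(10)])
print(len(engines))  # 10
```

For 10 to 100 documents this keeps the pattern simple; for inference time itself, the usual levers are batching, caching the built indexes to disk so they are not rebuilt per run, and routing a query to only the relevant engines rather than all of them.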
7 comments