@Logan M I have a couple of questions about your "Discover LlamaIndex: Bottoms-Up Development with LLMs (Part 3, Evaluation)" YouTube video:
- First of all, amazing video, very informative, but I can't find the link to the Jupyter notebook for this video.
- Also, in this video you created a big_document by concatenating all the smaller ones, with the reasoning that it leads to fewer LLM calls, but doesn't the context window become too large then? Is this a good way to improve the performance of our LLM-based apps? (I've sketched what I understood from the video below this list.)
- How are you able to print the time taken at the bottom of each run? 🥹 (My best guess is also below.)
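
For context on the second question, this is roughly the concatenation pattern I understood from the video. It's just my sketch, not your actual notebook; the directory path and variable names are placeholders, and the import paths may differ depending on the llama_index version (newer releases use llama_index.core):

```python
# My rough understanding of the "one big document" step, not the original notebook.
from llama_index import Document, SimpleDirectoryReader

# Load the individual documents (placeholder path)
docs = SimpleDirectoryReader("./docs").load_data()

# Concatenate all of their text into a single big document,
# so downstream generation/evaluation runs over one document
# instead of many (fewer LLM calls overall)
big_document = Document(text="\n\n".join(doc.text for doc in docs))
```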
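
And for the timing question, the only way I know is a simple wrapper around time.perf_counter like the one below (the helper name is hypothetical, not from your video). I'm curious whether you used something like this or a built-in notebook feature such as the %%time cell magic:

```python
import time

def timed_run(fn, *args, **kwargs):
    """Run a callable and print how long it took (hypothetical helper, not from the video)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    print(f"Elapsed: {elapsed:.2f} seconds")
    return result
```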