Find answers from the community

Saúl
Joined September 25, 2024
Hi everyone, I'm building a tool to analyze long YouTube lectures using an agent-based workflow, but I'm running into issues with managing the large context. What would be the best approach or tools to handle this efficiently without losing important information?

What I'm currently doing is splitting the transcript into fragments and passing each fragment through gpt-4o-mini, but the combined result is still too long to be processed in the agent workflow.
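
A minimal sketch of that chunk-and-summarize approach, with an extra reduce pass so the combined output actually fits the agent's context. It assumes the openai Python client and gpt-4o-mini; the transcript string, chunk sizes, and prompt wording are placeholders rather than a definitive recipe:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def summarize(text: str, instruction: str) -> str:
    """Ask gpt-4o-mini for a compact summary of a single piece of text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": instruction},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content

def chunk(text: str, size: int = 8000, overlap: int = 500) -> list[str]:
    """Naive character-based splitting with a little overlap between fragments."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]

def summarize_lecture(transcript: str, max_chars: int = 6000) -> str:
    # Map step: summarize each fragment of the transcript independently.
    summaries = [
        summarize(fragment, "Summarize this lecture fragment, keeping key facts and terminology.")
        for fragment in chunk(transcript)
    ]
    combined = "\n\n".join(summaries)

    # Reduce step: keep collapsing the partial summaries until the result
    # is small enough to hand to the downstream agent workflow.
    while len(combined) > max_chars:
        summaries = [
            summarize(fragment, "Merge these partial summaries into one shorter summary.")
            for fragment in chunk(combined)
        ]
        combined = "\n\n".join(summaries)
    return combined
```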
4 comments
Saúl · Llamaparse

Hi everyone, I'm creating my first large project with LlamaIndex. I would like to know which option you recommend for processing complex PDFs with tables, etc.
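
A minimal sketch of how LlamaParse is commonly wired into a LlamaIndex pipeline for table-heavy PDFs. It assumes the llama-parse and llama-index packages are installed; the API key, file path, and query are placeholders:

```python
from llama_parse import LlamaParse
from llama_index.core import VectorStoreIndex

# Parse the PDF to markdown so tables survive as structured text.
parser = LlamaParse(
    api_key="llx-...",        # placeholder: your LlamaCloud API key
    result_type="markdown",   # "markdown" preserves table structure; "text" is plain
)
documents = parser.load_data("report_with_tables.pdf")  # placeholder path

# Build a simple index over the parsed documents and query it.
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
print(query_engine.query("What values appear in the revenue table?"))
```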
2 comments
Saúl · Webinars

Is there any place where we can watch past webinars?
5 comments