I'm curious if llama-index offers tools for creating a summarization app that doesn't necessarily depend on LLMs. I'm specifically interested in vector-based solutions or smaller summarization models.
Handling large texts usually involves chunking with overlaps, which can be challenging to implement independently outside of llama-index or LangChain. I'm also trying to understand how llama-index implements both extractive and abstractive summarization. Thanks in advance!
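(For context, this naive character-window version is roughly what I mean by chunking with overlap; respecting sentence boundaries, token counts, etc. is where it gets fiddly:)

```python
def chunk_text(text: str, chunk_size: int = 1000, overlap: int = 200) -> list[str]:
    """Split text into fixed-size character windows that overlap by `overlap` chars."""
    step = chunk_size - overlap
    return [text[i : i + chunk_size] for i in range(0, len(text), step)]
```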
I'm curious if llama-index offers tools for creating a summarization app that doesn't necessarily depend on LLMs. -- it doesn't. Your options are sequential or tree-based summary approaches with LLMs.
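If you do go the LLM route, the tree-based approach looks roughly like this. A minimal sketch, assuming llama-index-core 0.10+ style imports and an LLM already configured (e.g. via OPENAI_API_KEY); the text here is a placeholder, and the SentenceSplitter takes care of the chunk-size/overlap part:

```python
from llama_index.core import Document, SummaryIndex
from llama_index.core.node_parser import SentenceSplitter

# Chunking with overlap is handled by the node parser.
splitter = SentenceSplitter(chunk_size=512, chunk_overlap=64)

docs = [Document(text="...your long text here...")]
index = SummaryIndex.from_documents(docs, transformations=[splitter])

# "tree_summarize" summarizes chunks and merges the partial summaries
# bottom-up; "refine" is the sequential alternative that walks the chunks
# one by one, updating a running summary.
query_engine = index.as_query_engine(response_mode="tree_summarize")
print(query_engine.query("Give a concise summary of the document."))
```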
It sounds like you have enough of a grasp on the topic that you could build something yourself. Without LLMs, you are looking at extractive approaches (LLMs are, of course, abstractive summarizers).
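For the vector-based extractive route, one common pattern is: split into sentences, embed them, score each sentence against the document centroid, and keep the top-k in their original order. A rough sketch, assuming sentence-transformers is installed (the model name is just an example, and the sentence splitting is deliberately crude):

```python
import numpy as np
from sentence_transformers import SentenceTransformer

def extractive_summary(text: str, k: int = 3, model_name: str = "all-MiniLM-L6-v2") -> str:
    # Crude sentence split; swap in a proper tokenizer for real use.
    sentences = [s.strip(" .") for s in text.split(". ") if s.strip(" .")]
    if len(sentences) <= k:
        return text

    model = SentenceTransformer(model_name)
    emb = model.encode(sentences, normalize_embeddings=True)  # (n, d), unit-norm rows

    # Score each sentence by cosine similarity to the document centroid.
    centroid = emb.mean(axis=0)
    centroid /= np.linalg.norm(centroid)
    scores = emb @ centroid

    # Keep the k highest-scoring sentences, preserving document order.
    top = sorted(np.argsort(scores)[-k:])
    return ". ".join(sentences[i] for i in top) + "."
```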