Hi there, is there a way using llama-index

Hi there, is there a way using llama-index to summarize a text? Basically I have a list of texts and I want to loop through them to get an overall summary. For example, I have texts as follows:

I want to have a summarization block before document querying

text = ['paragraph1', 'paragraph2', ...]
6 comments
You can use your LLM to generate a summary directly:

print(Settings.llm.complete("ADD PROMPT AND PARAGRAPH"))
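Looping that call over a list of texts might look like the sketch below. The helper name `summarize_all` is hypothetical, and the completion function is stubbed so the example runs without an API key; in real use you would pass `Settings.llm.complete` as `complete_fn`.

```python
def summarize_all(texts, complete_fn):
    """Summarize each paragraph with the supplied completion function,
    then combine the per-paragraph summaries into one overall summary."""
    partials = [str(complete_fn(f"Summarize the text below:\n\n{t}"))
                for t in texts]
    combined = "\n".join(partials)
    return str(complete_fn(f"Summarize the text below:\n\n{combined}"))

# Stub completion function so the sketch is self-contained; swap in
# Settings.llm.complete (or any HuggingFace LLM's complete) for real use.
fake_llm = lambda prompt: prompt.splitlines()[-1][:40]

texts = ["paragraph1", "paragraph2"]
print(summarize_all(texts, fake_llm))
```

The second `complete_fn` call is what turns the list of per-paragraph summaries into the single overall summary the question asks for.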
Thanks for the feedback. From the example, would something like this work?

print(Settings.llm.complete(f"Summarize the above: {text[0]}"))

And can we set up chunk size, overlap, and other parameters?
Summarise the below*, but I would add more definition in the prompt, like stating a proper instruction, what not to do while summarising, etc.

By chunk, do you mean max_tokens to generate?

Plain Text
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI
Settings.llm = OpenAI(system_prompt="ADD_YOUR_SYSTEM_PROMPT_HERE", max_tokens=512)
To chunk: basically to break down the paragraph into smaller parts and input them. I am using a HuggingFace LLM, so I guess the function call will be a bit different.
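llama-index ships text splitters for this (e.g. `SentenceSplitter`, which takes `chunk_size` and `chunk_overlap` parameters), but the idea can be sketched in plain Python. This is a word-based chunker for illustration only, not the library's implementation; the size and overlap numbers are arbitrary defaults.

```python
def chunk_words(text, chunk_size=128, overlap=16):
    """Break text into chunks of at most chunk_size words, repeating
    `overlap` words between consecutive chunks so context carries over."""
    words = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
        if start + chunk_size >= len(words):
            break
    return chunks
```

Each chunk can then be fed to the LLM separately (as in the loop above), which is what keeps long paragraphs under the model's context limit.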
Input them as a query or as a node?
I think as a query. So basically, as step one I would like to get a summary and write it to a text file. Once I have a series of documents like that, I would like to do simple QA on the summarized documents. The use case, for context, is to use Reddit's API to get posts and comments summarized or explained, instead of going through them one by one...
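Step one of that flow might be sketched as below. The `summarize_to_files` helper is hypothetical and the summarizer is stubbed so the sketch runs offline; in practice the summarizer would be an LLM call, and the saved files could later be loaded for QA (e.g. with llama-index's `SimpleDirectoryReader` into a `VectorStoreIndex`).

```python
from pathlib import Path

def summarize_to_files(posts, summarize, out_dir):
    """Step 1: summarize each post and write one summary file per post.
    Returns the paths written; QA over these files is a later step."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    paths = []
    for i, post in enumerate(posts):
        path = out / f"summary_{i}.txt"
        path.write_text(summarize(post))
        paths.append(path)
    return paths

# Stub summarizer (uppercases the text) so this runs without an API key;
# replace with an LLM completion call for real Reddit posts.
summarize_to_files(["post one body", "post two body"],
                   lambda t: t.upper(), "summaries")
```

Writing one file per post keeps the QA step decoupled: any reader/index can consume the directory of summaries later without re-running the LLM.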