Has anyone here tried document summarization with LlamaIndex and local LLMs? I could use some help.
8 comments
Local LLMs smaller than 13B are generally not good at following instructions.


You could check the LLM compatibility here: https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#open-source-llms
@WhiteFang_Jr I am using a basic query engine with Llama 2 13B. It should be able to summarise as long as it has access to the right chunks. How can I make that happen?
@Cipher Studies what issues are you running into? There should be a bunch of resources on how to use llamaindex with local models in the docs, e.g.: https://docs.llamaindex.ai/en/stable/module_guides/models/llms/local.html
Basically I want fine-grained control over how chunk retrieval works. I want to retrieve only those chunks that have the metadata `{"file_name": "myFile.txt"}`. How can I achieve this?
I'll look at the docs you provided too, thanks.
I solved the task temporarily by indexing each file separately and then using a SummaryIndex query engine for each file. It's very slow, but it works.