how to make this llama pack work with a local llm?
https://docs.llamaindex.ai/en/stable/examples/llama_hub/llama_pack_resume.html
https://github.com/run-llama/llama-hub/tree/2c42ff046d99c1ed667ef067735e77364f9b6b7a/llama_hub/llama_packs/resume_screener
I am getting all kinds of errors trying to make this work: errors about not using OpenAI, about a missing embed model, TreeSummarize issues, and JSON issues.

I tried using the pack directly in my code, passing my own LLM with this line:
Plain Text
resume_screener = ResumeScreenerPack(
    job_description=job_description,
    criteria=[
        "2+ years of experience in one or more of the following areas: machine learning, recommendation systems, pattern recognition, data mining, artificial intelligence, or related technical field",
        "Experience demonstrating technical leadership working with teams, owning projects, defining and setting technical direction for projects",
        "Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience.",
    ],
    llm=llm,
)

I also changed this line to add an embed model:
Plain Text
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
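For reference, here is a minimal sketch of how the `llm` and `embed_model` being passed in might be constructed, assuming the legacy llama-index API that still has `ServiceContext` (pre-0.10). `Ollama` and the model names below are just example choices, not requirements of the pack:

```python
# Sketch only: assumes a llama-index version that still exposes ServiceContext,
# with Ollama as an example local LLM backend and a HuggingFace embedding model.
from llama_index import ServiceContext
from llama_index.llms import Ollama
from llama_index.embeddings import HuggingFaceEmbedding

# Any locally served model works here; "llama2" is an example.
llm = Ollama(model="llama2", request_timeout=120.0)

# A small local embedding model so nothing falls back to OpenAI embeddings.
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
```

With both `llm` and `embed_model` set explicitly, nothing in the service context should try to reach OpenAI.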

Then I used my own PDF, of course:
Plain Text
response = resume_screener.run(resume_path="pathtofile")

Still can't get it to work.
Normally I'm used to using a query engine or something like that; could something be missing before a local LLM can be used?
4 comments
This is probably going to work very badly on less capable LLMs.

It's relying on the LLM to output JSON, which tbh is pretty difficult for less capable models these days.
Your changes make sense for switching to local LLMs, though.
Ah, that explains a lot. Could I use some wrapper to turn non-JSON output into JSON so this would work, or am I thinking too simply?
Thinking about it, the wrapper would have to sit inside LlamaIndex, because somewhere it's expecting the JSON. I don't know if it can be worked around somehow, or if there's another way, or if I should just try to adapt the code for my case.
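A wrapper along those lines can be sketched in plain Python: pull the first `{...}` span out of the model's chatty reply and parse that, before handing it to whatever expects JSON. This is a generic illustration, not part of the pack's API; the function name and the sample reply are made up:

```python
import json
import re

def extract_json(text: str) -> dict:
    """Pull a JSON object out of a chatty LLM reply.

    Greedily matches from the first '{' to the last '}', which works when
    the reply contains exactly one JSON object surrounded by prose.
    """
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in LLM output")
    return json.loads(match.group(0))

# Example: a local model wraps its JSON answer in conversational filler.
reply = (
    "Sure! Here is the screening result:\n"
    '{"criteria_met": true, "reason": "meets all criteria"}\n'
    "Let me know if you need anything else."
)
print(extract_json(reply)["criteria_met"])  # True
```

This only salvages replies where valid JSON is present but surrounded by prose; if the model produces malformed JSON, you would still need retries or a different model.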