How do I make this llama pack work with a local LLM?
https://docs.llamaindex.ai/en/stable/examples/llama_hub/llama_pack_resume.html
https://github.com/run-llama/llama-hub/tree/2c42ff046d99c1ed667ef067735e77364f9b6b7a/llama_hub/llama_packs/resume_screener
I am getting all kinds of errors trying to make this work: calls going to OpenAI when I don't want to use it, no embed model being set, TreeSummarize failures, and JSON parsing issues.
I tried using the pack directly in my code, passing my own LLM like this:
resume_screener = ResumeScreenerPack(
    job_description=job_description,
    criteria=[
        "2+ years of experience in one or more of the following areas: machine learning, recommendation systems, pattern recognition, data mining, artificial intelligence, or related technical field",
        "Experience demonstrating technical leadership working with teams, owning projects, defining and setting technical direction for projects",
        "Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience.",
    ],
    llm=llm,
)
I also changed this line in the pack's source to add an embed model:
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
Then I ran it on my own PDF, of course:
response = resume_screener.run(resume_path="pathtofile")
I still can't get it to work.
I don't know, but normally I'm used to going through a query engine or something. Could it be that something else needs to be set up before a local LLM will work here?