Non-OpenAI

Can LlamaIndex be used with models other than OpenAI? Let's say a local Llama. How do I do that? Thanks
hey!
Yes, you can use LlamaIndex with other LLMs very easily, and we have a lot of integrations with different LLMs.

Check out this guide for the available integrations: https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#modules


Also, here is an example of integrating Llama 2 into LlamaIndex: https://docs.llamaindex.ai/en/stable/examples/llm/llama_2_llama_cpp.html
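A minimal sketch of that llama.cpp route, assuming llama-cpp-python is installed and a local Llama 2 model file exists (the path below is a placeholder):

Plain Text
# Minimal sketch of the llama.cpp route from the guide above.
# Assumes llama-cpp-python is installed; the model path is a placeholder.
from llama_index.llms import LlamaCPP

llm = LlamaCPP(
    model_path="./models/llama-2-13b-chat.gguf",  # placeholder local path
    temperature=0.1,
    context_window=3900,
)
response = llm.complete("Hello, who are you?")
print(response)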
Hey! Thanks for the reply. Is there an example of using any Llama Packs with Llama 2 locally?
Not sure if there is, but you can download the pack and replace the LLM part with your newly created Llama 2 LLM object and it will work.
Thanks. Please tell me: let's say I want to use Llama 2 with this pack, what should I replace?
https://llamahub.ai/l/llama_packs-resume_screener?from=all
Also pass an embed model in there, otherwise it will start looking for an OpenAI key for embeddings. The code would change to something like this:

Plain Text
class ResumeScreenerPack(BaseLlamaPack):
    def __init__(
        self,
        job_description: str,
        criteria: List[str],
        llm: Optional[LLM] = None,  # pass your Llama 2 LLM here
    ) -> None:
        self.reader = PDFReader()
        llm = llm or OpenAI(model="gpt-4")
        # Set embed_model to avoid falling back to OpenAI embeddings
        service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")
        self.synthesizer = TreeSummarize(
            output_cls=ResumeScreenerDecision, service_context=service_context
        )
        criteria_str = _format_criteria_str(criteria)
        self.query = QUERY_TEMPLATE.format(
            job_description=job_description, criteria_str=criteria_str
        )
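The modified pack could then be constructed with the local LLM, roughly like this (the job description and criteria strings are made-up examples):

Plain Text
# Hypothetical usage of the modified pack with a local Llama 2 LLM
# (e.g. the LlamaCPP object from earlier in the thread).
screener = ResumeScreenerPack(
    job_description="Senior Python developer for an NLP team",  # made-up example
    criteria=["2+ years of Python", "experience with NLP"],     # made-up example
    llm=llm,  # your local Llama 2 LLM object
)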
Thanks! Maybe there is a guide to integrating a local Ollama instance running on localhost with LlamaIndex?
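For reference, LlamaIndex ships an Ollama integration. A minimal sketch, assuming Ollama is serving on its default localhost port (11434) and the llama2 model has been pulled with `ollama pull llama2`:

Plain Text
# Minimal sketch: local Ollama as the LlamaIndex LLM.
# Assumes Ollama is running on localhost:11434 with llama2 pulled.
from llama_index.llms import Ollama

llm = Ollama(model="llama2", request_timeout=120.0)
print(llm.complete("Why is the sky blue?"))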
@WhiteFang_Jr very strange errors appear when I try llama2 and mistral:instruct https://github.com/klimovskis/Resume_screening_using_Ollama_and_Llama_index/tree/main
May you please assist?
Sure, what error are you getting?
Thanks for the reply! Here are the errors. They differ between llama2 and mistral:instruct. This one is for mistral:instruct:
Attachments
Screenshot_2024-02-03_at_13.18.52.png
Screenshot_2024-02-03_at_13.18.57.png
I think the open-source model is not able to convert the answer into the pydantic format. The pack's TreeSummarize call sets output_cls=ResumeScreenerDecision, so the LLM has to produce structured output that parses cleanly into that pydantic model, and weaker models often emit malformed output there.

Open-source models are not great for these cases.
Sharing the compatibility report for open-source LLMs:
https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#open-source-llms
You can see that llama2 and mistral are not very good at handling pydantic scenarios.
Thanks, I see. So what would your approach be?
If you check the compatibility report, you'll find that there are some open-source LLMs capable of handling pydantic output.

You can try those and see if any of them works for you.
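A hypothetical sketch of that swap (the model name below is illustrative, not a pick from the report; verify against the report first):

Plain Text
# Hypothetical: try a model the compatibility report rates well on
# structured (pydantic) output. "zephyr" is illustrative only and
# assumes it has been pulled locally with `ollama pull zephyr`.
from llama_index.llms import Ollama

llm = Ollama(model="zephyr", request_timeout=120.0)
screener = ResumeScreenerPack(
    job_description="...",  # your job description
    criteria=["..."],       # your screening criteria
    llm=llm,
)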