Hi, I'm trying out the new features in version 0.9. How do I pass the service context correctly to the ingestion pipeline's transformations so that TitleExtractor() uses a local LLM rather than OpenAI? My code is:
Plain Text
    service_context = ServiceContext.from_defaults(embed_model=embed_model, llm=llm)
    pipeline = IngestionPipeline(
        service_context=service_context,
        transformations=[
            SentenceSplitter(),
            TitleExtractor(),
        ]
    )
The error I get is "Could not load OpenAI model", but my llm is defined as Llama 2.
3 comments
Try passing it like this once:

Plain Text
    service_context = ServiceContext.from_defaults(embed_model=embed_model, llm=llm)
    pipeline = IngestionPipeline(
        transformations=[
            SentenceSplitter(),
            TitleExtractor(),
            service_context.embed_model,
        ]
    )
Thanks, but with TitleExtractor I still get a missing OpenAI key error. The llm in my service_context is defined using LlamaCPP (Mistral).
Actually, this solved it: TitleExtractor(llm=llm). Thanks for the help!
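For anyone landing here later, here's a minimal end-to-end sketch of the fix. The module paths assume llama-index 0.9, and the model path and data directory are placeholders:

Plain Text
    from llama_index import SimpleDirectoryReader
    from llama_index.llms import LlamaCPP
    from llama_index.ingestion import IngestionPipeline
    from llama_index.node_parser import SentenceSplitter
    from llama_index.extractors import TitleExtractor

    # Local Mistral model served through llama.cpp; the path is a placeholder.
    llm = LlamaCPP(model_path="./models/mistral-7b-instruct.Q4_K_M.gguf")

    pipeline = IngestionPipeline(
        transformations=[
            SentenceSplitter(),
            # Pass the local llm explicitly so the extractor does not
            # fall back to the default OpenAI model.
            TitleExtractor(llm=llm),
        ]
    )

    documents = SimpleDirectoryReader("./data").load_data()
    nodes = pipeline.run(documents=documents)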