
AttributeError: 'LLMPredictor' object has no attribute '_llm'. Did you mean: 'llm'

I am trying to reference the LLM I provided to a ServiceContext object, but when I call a simple complete() through the service context I get the following error:

manager.service_context.llm.complete('Hi what is 5+5?') (Manager is just a class I wrote that wraps over some LlamaIndex objects, including a ServiceContext object.)

My service context object: (created in the Manager class where I use type-hints and set a default)
Python
    service_context: ServiceContext = ServiceContext.from_defaults(
        embed_model=AzureOpenAIEmbedding(
            model="text-embedding-ada-002",
            azure_deployment="text-embedding-ada-002",
            azure_endpoint=str(settings.azure_openai_api_base),
            api_version=str(settings.azure_openai_api_version),
            api_key=str(settings.azure_openai_api_key),
        ),
        llm=AzureOpenAI(
            model="gpt-4",
            azure_deployment="gpt-4",
            azure_endpoint=str(settings.azure_openai_api_base),
            api_version=str(settings.azure_openai_api_version),
            api_key=str(settings.azure_openai_api_key),
        ),
    )


Error:
AttributeError: 'LLMPredictor' object has no attribute '_llm'. Did you mean: 'llm'?

I have been able to reference the LLM from a service context in the past... not sure why it's suddenly not working now. I am running llama-index 0.9.39.
Hmmm that's super sus. Looking at the source code I don't see any obvious reason why that would fail
Yeah same here. I could re-instantiate a connection to OpenAI but I'd rather not if I don't have to
I think you might have to instantiate it for now. Trying myself, I am unable to replicate this πŸ€” https://colab.research.google.com/drive/1RnkX02R2V899cpg8qiMTxgvKQapN2RJg?usp=sharing
Do you think it might have to do with my class structure? I'm working on a project that is meant to manage stateless microservice sessions over a Llama Index agent. For this reason I have one class SessionManager and another Session

SessionManager contains the following objects: (which are instantiated upon creation)
  • VectorStore
  • Vector Store Index
  • Ingestion Pipeline
  • ServiceContext
Session contains all of the objects that are built on top of the objects above, and is created with a SessionManager as an input. So basically, whenever a Session needs an LLM, it references the ServiceContext from SessionManager (service_context.llm).
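A plain-Python sketch of the shape being described, just to make the access pattern concrete — every name and signature here is an assumption, not the poster's actual code:

```python
class SessionManager:
    """Created once; owns the shared, long-lived objects
    (vector store, index, pipeline, service context)."""
    def __init__(self, service_context):
        self.service_context = service_context


class Session:
    """Stateless per-request wrapper; borrows everything it
    needs from the SessionManager it was created with."""
    def __init__(self, manager):
        self.manager = manager

    def ask(self, prompt):
        # Reaches the LLM by reference through the shared
        # service context rather than holding its own copy.
        return self.manager.service_context.llm.complete(prompt)
```

The point of the design is that the Session never owns the LLM; it only dereferences it through the manager, so everything stays a single shared instance — unless something in between copies it.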
I'm fairly certain I'm doing something wrong with Pydantic. When I run this outside of my classes it runs fine; the code is almost identical. The only difference is that it lives in a plain script outside of my classes.
Just going to drop pydantic.
Hmmm yea there might be some spooky stuff happening. Are you multiprocessing? Pickling the service context maybe?
I'm running some of these methods async. I don't think I'm pickling the service context... it is being passed by reference, though maybe it's a copy and I don't realize it.
Removing Pydantic fixed it. Weird, I really wanted to get used to using Pydantic but I am on a deadline lol
Ugh pydantic has some weird side effects sometimes... glad it works now!
I see its value. Makes classes a lot prettier to look at.
But yeah, this behavior was weird.