
UnboundLocalError: local variable

UnboundLocalError: local variable 'default_template' referenced before assignment

I set query_wrapper_prompt only in a service_context
Plain Text
import os

from langchain.llms import HuggingFaceTextGenInference
from llama_index import ServiceContext
from llama_index.llms import LangChainLLM
...
...
    query_wrapper_prompt = "[INST] {query_str} [/INST]"
    embed_model = 'local:BAAI/bge-base-en'

    server_url = os.getenv('TGIS_SERVER_URL', 'http://localhost') # Get server url from env else default
    server_port = os.getenv('TGIS_SERVER_PORT', '8049') # Get server port from env else default

    inference_server_url = f"{server_url}:{server_port}/"


    tgis_predictor = LangChainLLM(
        llm=HuggingFaceTextGenInference(
            inference_server_url=inference_server_url,
            max_new_tokens=256,
            temperature=temperature,
            repetition_penalty=repetition_penalty,
            server_kwargs={},
        ),
    )

    service_context = ServiceContext.from_defaults(chunk_size=1024, llm=tgis_predictor, 
                                                   embed_model=embed_model,
                                                   query_wrapper_prompt=query_wrapper_prompt)
Do you have to set both system_prompt and query_wrapper_prompt?
Do you have the full traceback?
Oh whoops mb
Need to wrap it with a prompt template class
Plain Text
from llama_index.prompts import Prompt

query_wrapper_prompt = Prompt("...")
Interestingly, if I added a system prompt as well, it "fixed" it.
Using Prompt did not fix it:
Plain Text
    query_wrapper_prompt = Prompt("[INST] {query_str} [/INST]")
...
    service_context = ServiceContext.from_defaults(chunk_size=1024, llm=tgis_predictor, 
                                                   query_wrapper_prompt=query_wrapper_prompt,
#                                                   system_prompt=system_prompt,
                                                   embed_model=embed_model)


Same error:

Plain Text
UnboundLocalError: local variable 'default_template' referenced before assignment
Do you have the full traceback? With the location of the error, I can narrow it down in the library.
Setting it as the system prompt sadly is not the correct solution 🥲
Gist inbound (too long for Discord)
No, I didn't set it AS the system prompt.
If I only pass query_wrapper_prompt, I get the error.
If I pass query_wrapper_prompt AND system_prompt when declaring the service_context, I don't get the error:
Plain Text
    service_context = ServiceContext.from_defaults(chunk_size=1024, llm=tgis_predictor, 
                                                   query_wrapper_prompt=query_wrapper_prompt,
                                                   system_prompt=system_prompt,
                                                   embed_model=embed_model)

^^^ no error
Plain Text
    service_context = ServiceContext.from_defaults(chunk_size=1024, llm=tgis_predictor, 
                                                   query_wrapper_prompt=query_wrapper_prompt,
                                                   embed_model=embed_model)

^^^ error.
I see the issue
https://github.com/jerryjliu/llama_index/blob/644c034a249fa359181f8ebe988b8c2b93401814/llama_index/llm_predictor/base.py#L239

The default_template variable is only assigned inside the scope of another if statement, so when that branch is skipped, the later reference raises the UnboundLocalError.

I'm really surprised mypy or our linting didn't catch that 😅
Easy fix at least
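For illustration, here is a minimal sketch of the bug pattern, as a hypothetical simplification rather than the actual library code (the function name and signature are invented):

Plain Text
def build_full_prompt(system_prompt=None, query_wrapper_prompt=None):
    if system_prompt is not None:
        # default_template is only ever bound inside this branch
        default_template = system_prompt + "\n"
    if query_wrapper_prompt is not None:
        # With only query_wrapper_prompt set, the branch above never ran,
        # so Python raises:
        # UnboundLocalError: local variable 'default_template'
        # referenced before assignment
        return default_template + query_wrapper_prompt
    return "{query_str}"

Calling this with only query_wrapper_prompt reproduces the error, while passing both arguments masks it, matching the behavior observed above.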
The latest release has this fix (and the global context fix 💪)
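The fix is correspondingly small: bind default_template unconditionally before the conditionals, sketched here against the same hypothetical simplification:

Plain Text
def build_full_prompt(system_prompt=None, query_wrapper_prompt=None):
    # Fixed: default_template is always bound, whichever branches run
    default_template = ""
    if system_prompt is not None:
        default_template = system_prompt + "\n"
    if query_wrapper_prompt is not None:
        return default_template + query_wrapper_prompt
    return "{query_str}"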