
Hello,

Since yesterday, I've been getting an error when using llama_index with llama_cpp to load the llama-2 13B model:

Plain Text
/usr/local/lib/python3.10/dist-packages/llama_index/indices/service_context.py in from_defaults(cls, llm_predictor, llm, prompt_helper, embed_model, node_parser, llama_logger, callback_manager, system_prompt, query_wrapper_prompt, chunk_size, chunk_overlap, context_window, num_output, chunk_size_limit)
    153                 raise ValueError("Cannot specify both llm and llm_predictor")
    154             llm = resolve_llm(llm)
--> 155         llm_predictor = llm_predictor or LLMPredictor(llm=llm)
    156         if isinstance(llm_predictor, LLMPredictor):
    157             llm_predictor.llm.callback_manager = callback_manager

/usr/local/lib/python3.10/dist-packages/llama_index/llm_predictor/base.py in __init__(self, llm, callback_manager, system_prompt, query_wrapper_prompt)
     93     ) -> None:
     94         """Initialize params."""
---> 95         self._llm = resolve_llm(llm)
     96 
     97         if callback_manager:
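For reference, here is roughly how I set things up; the model path is a placeholder for my local file:

Python
from llama_index import ServiceContext
from llama_index.llms import LlamaCPP

# Placeholder path to a local llama-2 13B model file
llm = LlamaCPP(model_path="/path/to/llama-2-13b.bin")

# ServiceContext.from_defaults resolves the LLM and wraps it in an
# LLMPredictor, which is the code path shown in the traceback above
service_context = ServiceContext.from_defaults(llm=llm)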
1 comment
What llama-cpp-python version do you have? Try installing 0.1.78 or less; I need to update the source code to better handle these versions.
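For example, something like this should pin it (pip will downgrade if a newer version is already installed):

Plain Text
pip install "llama-cpp-python<=0.1.78"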