
The llamapack code just needs to be updated.

Does anyone know how to download the pack and edit the code…

Plain Text
llm = Ollama(model=self_model, base_url=self._base_url)
Attachment
image.png
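
Presumably the edit is just that undefined name: self_model should be the attribute self._model, which is how the line reads in the traceback further down. A sketch of the corrected line inside OllamaQueryEnginePack.__init__ in ollama_pack/base.py (an assumed fix, not confirmed from the released pack):

Plain Text
# corrected: self_model -> self._model
llm = Ollama(model=self._model, base_url=self._base_url)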
I actually updated the pack! But I guess I need to make a release of llama-hub for it to be available lol

You actually already downloaded the code! It's in ./ollama_pack

You could edit it and then do from ollama_pack import OllamaQueryEnginePack
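
For reference, a minimal sketch of that local workflow, assuming the pack was downloaded into ./ollama_pack, that base.py has been edited as above, and that documents comes from any llama-hub loader (SimpleDirectoryReader and the ./data folder are just placeholders); the run() call follows the standard llama-pack interface:

Plain Text
from llama_index import SimpleDirectoryReader

# import the pack from the locally downloaded folder instead of re-downloading it
from ollama_pack import OllamaQueryEnginePack

# any llama-hub loader works; a plain directory reader is assumed here
documents = SimpleDirectoryReader("./data").load_data()

ollama_pack = OllamaQueryEnginePack(model="phi", documents=documents)
response = ollama_pack.run("What is this document about?")
print(response)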
Plain Text
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[7], line 2
      1 # You can use any llama-hub loader to get documents!
----> 2 ollama_pack = OllamaQueryEnginePack(model="phi", documents=documents)

File ~/work/SPEED/ollama_pack/base.py:27, in OllamaQueryEnginePack.__init__(self, model, base_url, documents)
     23 self._base_url = base_url
     25 llm = Ollama(model=self._model, base_url=self._base_url)
---> 27 embed_model = OllamaEmbedding(model_name=self._model, base_url=self._base_url)
     29 service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
     31 self.llm = llm

File ~/work/SPEED/ollama_pack/base.py:69, in OllamaEmbedding.__init__(self, model_name, base_url, verbose, **kwargs)
     57 def __init__(
     58     self,
     59     model_name: str,
   (...)
     62     **kwargs: Any,
     63 ) -> None:
     64     super().__init__(
     65         model_name=model_name,
     66         **kwargs,
     67     )
---> 69     self._verbose = verbose
     70     self._base_url = base_url

File /opt/conda/lib/python3.11/site-packages/pydantic/v1/main.py:357, in BaseModel.__setattr__(self, name, value)
    354     return object_setattr(self, name, value)
    356 if self.__config__.extra is not Extra.allow and name not in self.__fields__:
--> 357     raise ValueError(f'"{self.__class__.__name__}" object has no field "{name}"')
    358 elif not self.__config__.allow_mutation or self.__config__.frozen:
    359     raise TypeError(f'"{self.__class__.__name__}" is immutable and does not support item assignment')

ValueError: "OllamaEmbedding" object has no field "_verbose"



Yikes @Logan M
Attachment
image.png
I realize the second error required the same fix, but I'm not sure if it keeps redownloading when I run the cell.
It probably does redownload; there's no need to run the download_llama_pack() function more than once.
Not sure where the verbose thing is coming from, I don't see it in the latest version of the llama-index source code
Does that mean your update is missing this verbose issue? 😁
!pip install llama-index transformers sentence-transformers pypdf --quiet

Will running this get the latest?
!pip install --upgrade ...
doing that will fix the verbose thing (although you might have to restart your notebook after updating)
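
Presumably the full upgrade cell would name the same packages as the earlier install; something like the following (the exact package list here is an assumption):

Plain Text
# upgrade to the latest llama-index release so the notebook picks up the fix
!pip install --upgrade llama-index --quiet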
I did restart the kernel, and it's showing Name: llama-index Version: 0.9.19

Plain Text
        llm = Ollama(model=self._model, base_url=self._base_url)

        embed_model = OllamaEmbedding(model_name=self._model, base_url=self._base_url)
Both llm and embed_model should get the same fix, I assume, based on your previous fix suggestion for the llm variable.
I also just edited base.py

and ran only

ollama_pack = OllamaQueryEnginePack(model="phi", documents=documents)

but I'm getting the same old error. It's as if it doesn't see the change I made to base.py

I run

Plain Text
from llama_index.llama_pack import download_llama_pack

# download and install dependencies; comment out if already downloaded
#OllamaQueryEnginePack = download_llama_pack("OllamaQueryEnginePack", "./ollama_pack")


after editing base.py
Do you want me to send you my notebook file... it's pretty straightforward stuff I put together from the llama-hub example...
Try running outside of a notebook first -- it works fine for me locally
I suspect the old package is cached or something silly by the notebook
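
If it is the notebook caching the old module, one way to pick up edits to base.py without restarting is to reload the package explicitly; a minimal sketch, assuming the pack was imported from ./ollama_pack as above:

Plain Text
import importlib

import ollama_pack
import ollama_pack.base

# Python caches imported modules in sys.modules, so edits to base.py are not
# seen until the kernel restarts or the modules are reloaded explicitly
importlib.reload(ollama_pack.base)
importlib.reload(ollama_pack)

from ollama_pack import OllamaQueryEnginePack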
OK, I will run it when I get home and give you the debug output.

Please look out for me, bro.

This is probably the best RAG document Python Q/A I've seen, and I've been looking for a simple but local LLM-based solution for a while...