Try raising the timeout in base.py in the OllamaQueryEnginePack — the default request_timeout on the Ollama LLM is only 30s:

llm = Ollama(model=self._model, base_url=self._base_url, request_timeout=300)
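The failure mode is a client-side timeout, not a server error: the HTTP client gives up before the model finishes generating. The effect can be reproduced with a stdlib-only sketch — a dummy slow server standing in for Ollama (all names, delays, and the /api/chat path here are illustrative, not the llama-index code):

```python
import http.server
import socket
import threading
import time
import urllib.error
import urllib.request

class SlowHandler(http.server.BaseHTTPRequestHandler):
    """Stand-in for a slow model server: every POST takes 0.5 s to answer."""
    def do_POST(self):
        time.sleep(0.5)
        body = b'{"done": true}'
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo output clean
        pass

class QuietServer(http.server.HTTPServer):
    def handle_error(self, request, client_address):
        pass  # a client that timed out and hung up is expected here

def chat(base_url, request_timeout):
    """POST to /api/chat with a client-side timeout, like the LLM client does."""
    req = urllib.request.Request(f"{base_url}/api/chat", data=b"{}", method="POST")
    try:
        with urllib.request.urlopen(req, timeout=request_timeout) as resp:
            return resp.status
    except (TimeoutError, socket.timeout, urllib.error.URLError):
        return "timed out"

server = QuietServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base_url = f"http://127.0.0.1:{server.server_address[1]}"

short = chat(base_url, request_timeout=0.1)  # shorter than the model's latency
long_ = chat(base_url, request_timeout=5)    # generous timeout
print(short, long_)
server.shutdown()
```

The same logic applies at full scale: a small model on a CPU-bound machine can easily take longer than 30 s to answer, so a generous request_timeout (e.g. 300) avoids the spurious failure.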
ollama_pack = OllamaQueryEnginePack(model="dolphin-phi", documents=documents)
response = ollama_pack.run("What is the title of the doc?")
WARNING: Package(s) not found: langchain
Name: llama-index            Version: 0.9.25.post1
Name: transformers           Version: 4.36.2
Name: sentence-transformers  Version: 2.2.2
Name: pypdf                  Version: 3.17.4
Note: you may need to restart the kernel to use updated packages.
Python 3.11.5
    101 with httpx.Client(timeout=Timeout(self.request_timeout)) as client:
--> 102     response = client.post(
    103         url=f"{self.base_url}/api/chat",
    104         json=payload,
    105     )
(the traceback is from the Ollama LLM's base.py)
Can we comment out llm = Ollama(model=self._model, base_url=self._base_url) and use this instead?

llm = MistralAI(api_key=api_key)
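Commenting out the line in the pack's base.py works, but a cleaner pattern is to let the pack accept any LLM and only fall back to the hard-coded Ollama default. A minimal sketch of that dependency-injection idea, using stub classes rather than the real llama-index API (all names here are illustrative):

```python
# Stub stand-ins for the real LLM classes, just to show the wiring.
class Ollama:
    def __init__(self, model, base_url, request_timeout=30):
        self.name = f"ollama:{model}"

class MistralAI:
    def __init__(self, api_key):
        self.name = "mistralai"

class QueryEnginePack:
    """Sketch of a pack that takes an optional llm instead of hard-coding one."""
    def __init__(self, model, documents, llm=None):
        # Fall back to the original hard-coded Ollama when no llm is given.
        self._llm = llm or Ollama(model=model, base_url="http://localhost:11434")
        self._documents = documents

pack_default = QueryEnginePack("dolphin-phi", documents=[])
pack_mistral = QueryEnginePack("dolphin-phi", documents=[],
                               llm=MistralAI(api_key="..."))
print(pack_default._llm.name)  # ollama:dolphin-phi
print(pack_mistral._llm.name)  # mistralai
```

With this shape the swap is a constructor argument rather than an edit to the installed package, so upgrades to the pack don't overwrite the change.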
ollama serve runs on boot, as it was installed from their official script. The model is dolphin-phi. Memory is at 8GB / 16GB.