
At a glance

The community member is having trouble running a multi-modal Ollama/Llava example in Colab, receiving a connection refused error. They also tried running it locally and received a similar error. The comments indicate that Ollama requires starting a server and pulling the model, which was not done in the Colab example. The community members suggest that the Colab example is a "bad notebook" and that the community member should refer to the Ollama repository for the proper setup instructions.
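As context for the error below (not from the original thread): Ollama serves its HTTP API on localhost port 11434 by default, so a connection-refused error means nothing is listening on that port yet. A minimal sketch of how one might confirm this, assuming the default port and the requests library:

import requests

try:
    # Ollama's default API endpoint; the root path replies "Ollama is running"
    response = requests.get("http://localhost:11434")
    print(response.text)
except requests.exceptions.ConnectionError:
    # This is the state the notebook hits: no server is listening on the port
    print("No Ollama server is listening; start one with 'ollama serve'.")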

This multi-modal Ollama/Llava example isn't working for me in Colab: https://github.com/run-llama/llama_index/blob/main/docs/examples/multi_modal/ollama_multi_modal.ipynb I'm receiving the following connection refused error:
Attachment: image.png (screenshot of the connection refused error)
6 comments
I also tried it with a local file and am receiving the error ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it
Do you have an ollama server already running?
@Logan M No, in my local file test I just did pip install ollama but didn't actually start a server. The Colab example isn't using a local server either, and it gets a connection error in both cases.
Do I need to manually start the server?
Yea -- bad notebook example, I guess. Ollama requires that you start a server and pull the model you want to use.
Whoops my bad, thanks. Looks like this is what I need https://github.com/ollama/ollama
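For anyone landing here later, here is a rough sketch of the setup described in the comments above, not taken from the notebook itself. It assumes Ollama is installed from the linked repository, the server is started and a Llava model is pulled before the Python code runs, and a recent llama_index release where the Ollama multi-modal integration is importable as shown; adjust the import paths and the ./images placeholder to your setup.

# Shell setup first (outside Python), per the comments above:
#   ollama serve          # start the local Ollama server (default port 11434)
#   ollama pull llava     # download the Llava multi-modal model
#
# In newer llama_index releases the integration ships separately:
#   pip install llama-index-multi-modal-llms-ollama

from llama_index.core import SimpleDirectoryReader
from llama_index.multi_modal_llms.ollama import OllamaMultiModal

# Point the client at the locally running server and the pulled model
mm_model = OllamaMultiModal(model="llava")

# Load local images as ImageDocuments ("./images" is a placeholder directory)
image_documents = SimpleDirectoryReader("./images").load_data()

response = mm_model.complete(
    prompt="Describe what is in the image.",
    image_documents=image_documents,
)
print(response)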