This multi-modal Ollama/LLaVA example isn't working for me in Colab: https://github.com/run-llama/llama_index/blob/main/docs/examples/multi_modal/ollama_multi_modal.ipynb. I'm receiving the following connection refused error:
[Attachment: image.png]
I also tried it with a local file and am receiving this error: ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it
Do you have an Ollama server already running?
@Logan M No, in my local file test I just did pip install ollama but didn't actually start a server. The Colab example isn't using a local server either, and I get a connection error in both cases.
Do I need to manually start the server?
Yeah, bad notebook example I guess. Ollama requires that you start a server and pull the model you want to use.
Whoops, my bad, thanks. Looks like this is what I need: https://github.com/ollama/ollama
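
For anyone landing on this thread: the fix is to run an Ollama server and pull the model before running the notebook. Below is a minimal sketch, assuming the Ollama CLI from https://github.com/ollama/ollama is installed and a llama_index version matching the linked notebook (the OllamaMultiModal import path may differ in newer releases).

```python
# Terminal setup first (outside Python), assuming the Ollama CLI is installed:
#   ollama serve        # start the local Ollama server (defaults to localhost:11434)
#   ollama pull llava   # download the multi-modal model the notebook uses
#
# Then, in the notebook, the client can connect to the running server.
from llama_index.multi_modal_llms.ollama import OllamaMultiModal

# "llava" is the model name used in the linked example notebook.
mm_model = OllamaMultiModal(model="llava")
```

With no server listening on localhost:11434, every request is refused immediately, which is exactly the connection refused / WinError 10061 error shown above. Also note that in Colab, localhost refers to the Colab runtime, not your own machine, so the server has to be started inside that runtime (or the client pointed at a reachable host).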