
Updated last year

Getting a strange connection error after trying to do a quick test with the mistral llm

At a glance

A community member is experiencing a strange connection error when trying to use the Mistral LLM. They have provided a code snippet that imports the Ollama class from the llama_index.llms module and initializes an instance with the "mistral" model and a request timeout of 30 seconds. The code then attempts to complete the prompt "Who is Usain Bolt?" and print the response.

In the comments, another community member suggests that the issue may be related to the ollama server not running. The original poster acknowledges this and thanks the commenter for the help.

Getting a strange connection error after trying to do a quick test with the mistral llm

My code is literally just
from llama_index.llms import Ollama
llm = Ollama(model="mistral", request_timeout=30.0)
resp = llm.complete("Who is Usain Bolt?")
print(resp)

Any ideas on what could be causing it?
Attachment: image.png
2 comments
Is your ollama server running?
Can't believe I made that much of a noob mistake. Thanks a bunch
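For anyone hitting the same error: the `Ollama` client in the snippet above talks to a local Ollama server (by default at `http://localhost:11434`, started with `ollama serve`). A minimal sketch of a pre-flight check, assuming only the default port and Python's standard library (`ollama_is_reachable` is a hypothetical helper name, not part of llama_index):

```python
import urllib.request
import urllib.error


def ollama_is_reachable(base_url: str = "http://localhost:11434",
                        timeout: float = 2.0) -> bool:
    """Return True if something answers HTTP at base_url.

    A running Ollama server responds to a bare GET on its root;
    a connection error here means the server is not listening,
    which is the usual cause of the connection error above.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    if not ollama_is_reachable():
        print("Ollama server not reachable - try running `ollama serve` first")
```

Calling this before `llm.complete(...)` turns a cryptic connection traceback into an actionable message.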