A community member is experiencing a strange connection error when trying to use the Mistral LLM through Ollama. They have provided a code snippet that imports the Ollama class from the llama_index.llms module and initializes an instance with the "mistral" model and a 30-second request timeout. The code then attempts to complete the prompt "Who is Usain Bolt?" and print the response.
In the comments, another community member suggests that the issue may be that the Ollama server is not running. The original poster acknowledges this and thanks the commenter for the help.
Getting a strange connection error when running a quick test with the Mistral LLM
My code is literally just:

```python
from llama_index.llms import Ollama

llm = Ollama(model="mistral", request_timeout=30.0)
resp = llm.complete("Who is Usain Bolt?")
print(resp)
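```

Since the diagnosis in the thread was that the Ollama server was not running, a quick reachability check makes that failure mode obvious before any llama_index call is made. Below is a minimal sketch, assuming Ollama's default local endpoint (http://localhost:11434) and the same legacy import path as the original snippet:

```python
import httpx
from llama_index.llms import Ollama

# Ollama's default address; adjust if the server is bound elsewhere.
OLLAMA_URL = "http://localhost:11434"

try:
    # A plain GET against the root succeeds when the server is up;
    # a ConnectError here usually means `ollama serve` isn't running.
    httpx.get(OLLAMA_URL, timeout=5.0)
except httpx.ConnectError:
    raise SystemExit(
        f"Cannot reach the Ollama server at {OLLAMA_URL}. "
        "Start it with `ollama serve` and try again."
    )

llm = Ollama(model="mistral", request_timeout=30.0)
resp = llm.complete("Who is Usain Bolt?")
print(resp)
```

Two related things worth checking: the model must have been downloaded first (`ollama pull mistral`), and in llama_index 0.10 and later the class moved, so the import becomes `from llama_index.llms.ollama import Ollama` after installing the `llama-index-llms-ollama` package.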