
Hello guys, I'm trying to use the Ollama model mistral with llama-index, but I'm getting an error when I try to run this simple script:
Plain Text
from llama_index.llms import Ollama

llm = Ollama(model='mistral')

resp = llm.complete("What did Rome grow? Be concise.")
print(resp)
What is the error?
Increase the request timeout (probably we should increase the default):
Plain Text
llm = Ollama(model='mistral', request_timeout=200)
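For reference, a minimal sketch of the full script with the increased timeout, assuming the same llama_index version and import path used in the original post, and that the local Ollama server is running with the mistral model already pulled:
Plain Text
from llama_index.llms import Ollama

# request_timeout is in seconds; raising it avoids timeouts on slow local generations
llm = Ollama(model='mistral', request_timeout=200)

resp = llm.complete("What did Rome grow? Be concise.")
print(resp)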
Thanks, I will try it.
I'm running it again with the change.
It works, thank you so much 🙂