mitjafelicijan
Hi. A quick question about using a local Ollama instance with the Mistral model together with a locally persisted storage folder. On the docs page, I only saw usage examples with OpenAI API keys.

I am, however, wondering how I would make this work with a locally running instance of Ollama that serves the Mistral model.

The code below works and gives me results, so Ollama itself is set up correctly.

Plain Text
from llama_index.llms.ollama import Ollama
llm = Ollama(model="mistral", request_timeout=30.0)
resp = llm.complete("Who is Paul Graham?")
print(resp)


A storage folder was also created and it contains index files, so that part worked properly as well.
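
For reference, the storage folder was built with roughly the following (paraphrasing from memory; the ./data and ./storage paths are just the ones I used, and the HuggingFaceEmbedding part is my own attempt at avoiding an OpenAI key for embeddings, so it may not be the recommended approach):

Plain Text
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Use a local embedding model so indexing does not call the OpenAI API
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Load the documents, build the vector index and persist it to disk
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
index.storage_context.persist(persist_dir="./storage")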

What am I missing here? I have been reading the documentation and haven't found an example for this, so I may very well have missed something in the docs.
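
For completeness, this is roughly what I have been trying in order to combine the two, pieced together from various snippets (the Settings-based wiring and the embedding model choice are guesses on my part, not something I found spelled out in the docs):

Plain Text
from llama_index.core import StorageContext, Settings, load_index_from_storage
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Point LlamaIndex at the local Ollama instance instead of OpenAI
Settings.llm = Ollama(model="mistral", request_timeout=30.0)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Reload the index that was persisted to ./storage and query it
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)

query_engine = index.as_query_engine()
print(query_engine.query("Who is Paul Graham?"))

My understanding is that the embedding model used at query time has to match the one used when the index was built, otherwise retrieval will not line up, which is part of what I am unsure about.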

Does anybody know how to achieve this?
2 comments