Hey everyone. So I'm using Ollama along with LlamaIndex. I followed the tutorial and the docs, and everything works fine until I try to change the model's parameters, like `max_new_tokens`. This is the code I'm using:

Python
from llama_index.llms.ollama import Ollama

llm = Ollama(
    base_url="https://localhost:11434",
    model="mistral:instruct",
    temperature=0.1,
    additional_kwargs={"max_new_tokens": 512},
    context_window=3900,
)


I get the following error:

Plain Text
httpx.ConnectError: [SSL] record layer failure (_ssl.c:1007)


This wasn't my first pass at trying to change the model's parameters. Originally I ran this code without specifying `base_url` at all, which threw the following error:

Plain Text
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'http://localhost:11434/api/chat'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400


So I specified the `base_url` with https instead of http. This only comes up as an issue if I pass the `additional_kwargs` argument. Does anyone know how I can fix this?
yea, https isn't the right URL. Ollama's local server speaks plain http, so the TLS handshake fails with that SSL error
Do you have the latest version of ollama running?
hmm, I was just using ollama yesterday and it worked fine
Can you try without the additional kwargs?
Okay. I went back to http, and I also scrubbed out `max_new_tokens` and replaced it with `num_predict`. That seems to have fixed it, I think? It's running now
ah perfect haha
yea, the original 400 error just meant something was wrong with the params you sent 👍
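
For anyone landing here later, a minimal sketch of the working setup this thread converges on, assuming the current llama-index Ollama integration; `num_predict` is Ollama's name for the max-tokens-to-generate option:

Python
from llama_index.llms.ollama import Ollama

# Plain http: the local Ollama server doesn't terminate TLS itself.
llm = Ollama(
    base_url="http://localhost:11434",
    model="mistral:instruct",
    temperature=0.1,
    # num_predict caps how many tokens Ollama generates per response,
    # which is what max_new_tokens does in HuggingFace-style APIs.
    additional_kwargs={"num_predict": 512},
    context_window=3900,
)

print(llm.complete("Why is the sky blue?").text)
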
do you know where I can read up on what exactly I can pass to additional_kwargs?
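
As far as I can tell, the llama-index Ollama integration forwards `additional_kwargs` into the `options` field of the Ollama API request, so the valid keys are Ollama's own generation options (documented in Ollama's API and Modelfile docs). A few common ones, by way of example:

Python
from llama_index.llms.ollama import Ollama

# Commonly used Ollama generation options; the names are Ollama's own,
# not the HuggingFace-style ones (max_new_tokens etc.).
llm = Ollama(
    base_url="http://localhost:11434",
    model="mistral:instruct",
    additional_kwargs={
        "num_predict": 512,     # max tokens to generate per response
        "top_k": 40,            # sample only from the top-k tokens
        "top_p": 0.9,           # nucleus-sampling cutoff
        "repeat_penalty": 1.1,  # discourage repetition
        "seed": 42,             # fixed seed for reproducible output
    },
)
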
Also thank you very very much for your help!!!
Man, this is so helpful! Thank you so much! I really appreciate it!!