Hey everyone. So I'm using `ollama` along with `llamaindex`. I followed the tutorial and docs, and everything works fine until I try to edit the model's parameters, like `max_new_tokens`. This is the code I'm using:
```python
from llama_index.llms.ollama import Ollama

llm = Ollama(
    base_url="https://localhost:11434",
    model="mistral:instruct",
    temperature=0.1,
    additional_kwargs={"max_new_tokens": 512},
    context_window=3900,
)
```
I get the following error:
```
httpx.ConnectError: [SSL] record layer failure (_ssl.c:1007)
```
This wasn't my first attempt at changing the model's parameters. Originally I ran this code without specifying `base_url` at all, which threw a different error:

```
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'http://localhost:11434/api/chat'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
```
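In case it helps anyone reproduce the 400 without llamaindex, this is roughly the request body I *think* llamaindex builds for Ollama's `/api/chat` endpoint (an assumption from reading the docs; I believe `additional_kwargs` gets merged into `"options"`, but I haven't captured the actual request):

```python
import json

# Hypothetical reconstruction of the /api/chat payload -- my guess,
# not a capture of what llamaindex actually sends on the wire.
payload = {
    "model": "mistral:instruct",
    "messages": [{"role": "user", "content": "Hello"}],
    "options": {
        "temperature": 0.1,
        "max_new_tokens": 512,  # the kwarg that seems to trigger the 400
    },
    "stream": False,
}
print(json.dumps(payload, indent=2))
```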
So I specified the `base_url` with `https` instead of `http`. This only comes up as an issue if I specify the `additional_kwargs` argument. Does anyone know how I can fix this?