
Ollama

Hey all! I ran into an issue that I can't find documented anywhere: when using Ollama locally, calling the query engine's `query` method gives me a 404 (not found) error for `localhost:11434/API/chat`. It seems like Ollama does not have that endpoint at all?

The weird thing is that this was working just fine two weeks ago. Any help here?
Checked the Ollama GitHub: the API endpoint mentioned there is lowercase (`/api/chat`), not uppercase:

https://github.com/jmorganca/ollama#rest-api
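For reference, here is a minimal sketch of building a request for that endpoint with just the Python stdlib (the model name and message are placeholders). Note the lowercase `api` in the path, which is what the 404 above was tripping over:

```python
# Sketch: build a request for Ollama's chat endpoint.
# The path is lowercase "/api/chat" -- "/API/chat" returns 404.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # lowercase "api"

def build_chat_request(model, messages, base_url=OLLAMA_URL):
    """Build a POST request for Ollama's /api/chat endpoint."""
    payload = json.dumps({"model": model, "messages": messages, "stream": False})
    return urllib.request.Request(
        base_url,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama2", [{"role": "user", "content": "Hello"}])
print(req.full_url)  # http://localhost:11434/api/chat

# To actually send it (requires a running Ollama server):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```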
Thank you! Forgot to come back to you: it turned out that something changed in both Ollama and LlamaIndex in the two weeks I didn't fire them up. The Python side got updated via a `pip install` upgrade, but Ollama did not update itself; updating Ollama explicitly solved the issue.
Ah, never mind, it seems I also need to set `request_timeout`.
Great! Thank you for the contribution @Cristian Paul, merged already πŸ‘