Hey all! I've run into an issue that I can't find anywhere: when using Ollama locally, calling the query engine's 'query' gives me a 404 (not found) error on 'localhost:11434/api/chat'. It seems like Ollama doesn't have that endpoint at all?
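For reference, this is roughly the setup that triggers that call (a minimal sketch assuming a recent llama-index with the Ollama integrations installed; the model names and the './data' folder are placeholders, not from my actual project):

```python
# Minimal sketch of a LlamaIndex + local Ollama query engine setup.
# The query() call below is where the request to localhost:11434/api/chat
# goes out and comes back 404. Model names and data path are placeholders.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.ollama import OllamaEmbedding

Settings.llm = Ollama(model="llama2", base_url="http://localhost:11434")
Settings.embed_model = OllamaEmbedding(
    model_name="nomic-embed-text", base_url="http://localhost:11434"
)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
response = query_engine.query("What is this document about?")  # 404 happens here
print(response)
```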
The weird thing is that this was working just fine two weeks ago. Any help here?
Thank you! I forgot to come back to you: it turned out that something changed in both Ollama and LlamaIndex in the two weeks I hadn't fired them up. The Python side got updated via pip install, but Ollama itself did not. Updating Ollama explicitly solved the issue.
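For anyone hitting this later: the older Ollama build on my machine simply didn't serve the /api/chat endpoint that the updated LlamaIndex integration calls, hence the 404. After updating Ollama, something like this quick sketch (the 'llama2' model name is just a placeholder for whatever model you've pulled) confirms the server and endpoint are responding:

```python
# Quick sanity check that the local Ollama server exposes /api/chat after updating.
# "llama2" is a placeholder; use a model you have pulled locally.
import requests

# Print the server version to confirm the update took effect.
print(requests.get("http://localhost:11434/api/version").json())

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama2",
        "messages": [{"role": "user", "content": "hello"}],
        "stream": False,
    },
)
# 404 means the endpoint is still missing (old Ollama); 200 means you're good.
print(resp.status_code)
```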