
Brum
Offline, last seen 3 months ago
Joined September 25, 2024
Brum · Ollama

Hey all! I've run into an issue that I can't find an answer to anywhere: when using Ollama locally, calling the query engine's `query` gives me a 404 (not found) error for `localhost:11434/api/chat`. It seems like Ollama does not have that endpoint at all?

The weird thing is that this was working just fine two weeks ago. Any help here?
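One plausible cause worth checking (an assumption, not confirmed in this thread): older Ollama builds only served `/api/generate`, and `/api/chat` was added in a later release, so a client library that recently switched to `/api/chat` will get a 404 against an outdated local server. A quick probe sketch (Node 18+ for built-in `fetch`; the port is Ollama's default, and `llama2` is just a placeholder model name):

```typescript
// Probe whether the local Ollama server exposes /api/chat at all.
// A 404 here usually means the Ollama binary predates the chat endpoint
// and should be upgraded; any other status means the route exists.

const OLLAMA_BASE = "http://localhost:11434";

// Pure helper: turn the HTTP status into a human-readable diagnosis.
function explain404(status: number): string {
  if (status === 404) {
    return "No /api/chat route: this Ollama build is likely outdated; upgrade it.";
  }
  return `Got HTTP ${status}: /api/chat exists on this server.`;
}

async function probeChatEndpoint(): Promise<string> {
  const res = await fetch(`${OLLAMA_BASE}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // Minimal body; "llama2" is an example model name, not from the thread.
    body: JSON.stringify({ model: "llama2", messages: [] }),
  });
  return explain404(res.status);
}
```

If the probe reports a missing route, upgrading Ollama and retrying the query engine is the obvious next step.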
7 comments
Hey, I saw in the README that the LlamaIndex.TS library supports Llama 2 models. Is there a guide on how to use it? I'd like to hook it up to my local Ollama instance.
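For context, hooking up to a local Ollama instance ultimately comes down to POSTing to its REST API; newer LlamaIndex.TS versions ship an Ollama LLM class that does exactly this under the hood. Below is a minimal sketch of the raw request, assuming Ollama's default port 11434 and a pulled `llama2` model (both assumptions, not taken from this thread):

```typescript
// Minimal direct call to a local Ollama server's /api/chat endpoint.
// Requires Node 18+ (built-in fetch) and a running `ollama serve`.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Pure helper: build the JSON body /api/chat expects.
// stream: false asks for a single JSON response instead of a stream.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

async function chat(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(
      buildChatRequest("llama2", [{ role: "user", content: prompt }])
    ),
  });
  const data = await res.json();
  // Non-streaming responses carry the reply in data.message.content.
  return data.message.content;
}
```

With the library, the equivalent step is pointing its Ollama LLM class at the same base URL and model; check the LlamaIndex.TS docs for the exact constructor options in your installed version.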
22 comments