Updated 7 months ago

API/chat

Please help me with this. I am on the Mac platform.
I got this error when I followed https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local/.
This is my code:
```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama

documents = SimpleDirectoryReader("data").load_data()

# nomic embedding model
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# ollama
Settings.llm = Ollama(model="llama3", request_timeout=360.0)

index = VectorStoreIndex.from_documents(
    documents,
)

query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
```

HTTPStatusError: Client error '404 Not Found' for url 'http://localhost:11434/api/chat'
I have tried the request with curl, and it works.
I also made the request manually in Python like this, and it also works:
```python
import httpx

url = "http://localhost:11434/api/chat"
data = {
    "model": "llama3:instruct",
    "messages": [
        {"role": "user", "content": "why is the sky blue?"}
    ]
}

response = httpx.post(url, json=data)

print(response.text)
```

I am really confused by this. Can anybody help me with this problem? What's wrong here?
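
One way to narrow down a 404 like this is to ask the Ollama server which model names it actually has installed: /api/chat returns 404 Not Found when the requested model has not been pulled locally. A minimal diagnostic sketch against Ollama's /api/tags endpoint (assuming the default localhost:11434 address):

```python
import httpx

# Ask the local Ollama server which models are installed.
# /api/chat responds with 404 when the requested model name
# (tag included) does not match one of these entries.
resp = httpx.get("http://localhost:11434/api/tags")
resp.raise_for_status()

for model in resp.json()["models"]:
    print(model["name"])  # e.g. "llama3:instruct"
```

If the name passed to Ollama(model=...) does not appear in this list exactly, tag included, the chat request fails with the error above.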
3 comments
Hi, have you tried accessing http://localhost:11434/api/chat in a browser?
No, I used curl on the command line, and httpx in code.
Thank you anyway. I have figured it out by myself. The model name was set wrong in my code. I changed it from llama3 to llama3:instruct, and it works.
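
For reference, the fix amounts to making the model name in the LlamaIndex setup match an entry the Ollama server actually reports. A minimal sketch of the corrected setting, assuming llama3:instruct is the model that was pulled:

```python
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

# "llama3" and "llama3:instruct" are distinct model entries in Ollama.
# If only "llama3:instruct" has been pulled, requesting "llama3"
# produces the 404 seen above.
Settings.llm = Ollama(model="llama3:instruct", request_timeout=360.0)
```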