Updated 3 months ago

Changing llama to use ollama in create-llama project

Hi everyone, in `npx create-llama`, how do I change the LLM to use Ollama?
3 comments
I've changed my .env to

LLAMA_CLOUD_API_KEY=

MODEL_PROVIDER=ollama
MODEL=llama3 # or another model

EMBEDDING_MODEL=mxbai-embed-large
EMBEDDING_DIM=1024

OLLAMA_BASE_URL=http://127.0.0.1:11434 # Adjust to the Ollama instance URL if it's running elsewhere

Still getting the pop-up for the OpenAI API key, though.
You'll have to edit the actual backend source code to load the Ollama LLM and embeddings.
I forget where it is off the top of my head (it depends on whether you're using the Python or TypeScript backend).
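For the Python backend, a rough sketch of what that wiring might look like, assuming the `llama-index-llms-ollama` and `llama-index-embeddings-ollama` integration packages are installed (the env-variable names mirror the .env above; exact file and location in the generated project may differ):

```python
# Sketch: point LlamaIndex's global Settings at a local Ollama instance
# instead of OpenAI. Assumes llama-index-llms-ollama and
# llama-index-embeddings-ollama are installed; model names are examples.
import os

from llama_index.core import Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.ollama import OllamaEmbedding

base_url = os.getenv("OLLAMA_BASE_URL", "http://127.0.0.1:11434")

# Route all LLM calls through Ollama.
Settings.llm = Ollama(model=os.getenv("MODEL", "llama3"), base_url=base_url)

# Use an Ollama-served embedding model instead of OpenAI embeddings.
Settings.embed_model = OllamaEmbedding(
    model_name=os.getenv("EMBEDDING_MODEL", "mxbai-embed-large"),
    base_url=base_url,
)
```

With `Settings` pointed at Ollama like this, index construction and queries in the backend should stop asking for an OpenAI key.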