Changing the LLM to use Ollama in a create-llama project
532914 · 4 months ago
Hi Everyone, in npx create-llama how do I change the LLM to use Ollama?
3 comments
532914 · 4 months ago
I've changed my .env to
LLAMA_CLOUD_API_KEY=
MODEL_PROVIDER=ollama
MODEL=llama3 # or another model
MODEL=llama2
EMBEDDING_MODEL=mxbai-embed-large
EMBEDDING_DIM=1024
OLLAMA_BASE_URL=http://127.0.0.1:11434 # Adjust to the Ollama instance URL if it's running elsewhere
Still getting the pop-up for the OpenAI key though.
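One quick sanity check for the OLLAMA_BASE_URL line above is to hit Ollama's /api/tags endpoint, which lists the models the server has pulled. A minimal Python sketch (the URL and model names are taken from the .env above, not from anything create-llama generates):

import json
import os
import urllib.request

# Confirm the Ollama server is reachable and the models named in .env are pulled.
base_url = os.getenv("OLLAMA_BASE_URL", "http://127.0.0.1:11434")
with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
    local_models = [m["name"] for m in json.load(resp)["models"]]
print(f"Models available at {base_url}: {local_models}")
# Expect llama3 and mxbai-embed-large (possibly tagged :latest) in the list;
# if either is missing, run `ollama pull <model>` first.

This only confirms the server side; as the replies below explain, the backend still has to be told to use those models.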
Logan M · edited 4 months ago
You'll have to edit the actual source code in the backend to load the Ollama LLM and embeddings.
Logan M · 4 months ago
I forget where it is off the top of my head (it depends on whether you are using the Python or TypeScript backend).
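For the Python backend, a minimal sketch of the kind of edit described above, using llama-index's Ollama integrations. The function name and the idea of calling it at startup are illustrative rather than what create-llama generates, and the integration packages (llama-index-llms-ollama, llama-index-embeddings-ollama) have to be installed first:

# Sketch only: point llama-index's global Settings at Ollama instead of OpenAI.
# Assumes: pip install llama-index-llms-ollama llama-index-embeddings-ollama
import os

from llama_index.core import Settings
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama

def init_ollama_settings() -> None:
    """Use a local Ollama server for both the LLM and the embedding model."""
    base_url = os.getenv("OLLAMA_BASE_URL", "http://127.0.0.1:11434")
    Settings.llm = Ollama(
        model=os.getenv("MODEL", "llama3"),
        base_url=base_url,
        request_timeout=120.0,  # local models can take a while on first load
    )
    Settings.embed_model = OllamaEmbedding(
        model_name=os.getenv("EMBEDDING_MODEL", "mxbai-embed-large"),
        base_url=base_url,
    )

Calling something like init_ollama_settings() before any index or query engine is built should keep the backend from defaulting to OpenAI and prompting for an API key. The TypeScript backend would need the equivalent change using LlamaIndex.TS's Ollama support.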