Hi! I am just checking the docs about using Ollama (they seem to be obsolete):

Plain Text
# Set Ollama

from llama_index.llms.ollama import Ollama
from llama_index.core import Settings

# Is this syntax below obsolete?
llm = Ollama(model="solar", request_timeout=60.0)
# Is this the new syntax?
Settings.llm = Ollama(model="solar", request_timeout=60.0)


Could anyone clarify when these changes were introduced? The Chroma and Ollama examples seem outdated.
20 comments
They are indeed not outdated πŸ™‚ There should be an install at the top of the example notebooks.
For example, pip install llama-index-llms-ollama
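For a fully local run you would also want a local embedding integration installed. A rough sketch of the installs (the HuggingFace embeddings package is my assumption here; use whichever embed model your notebook actually specifies):

Plain Text
pip install llama-index
pip install llama-index-llms-ollama
# assumed for local embeddings; any embedding integration works
pip install llama-index-embeddings-huggingface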
Oh yes, the pip installs are definitely there
In that example no Settings or llm are specified anyway, so it defaults to gpt-3.5-turbo
And it just asks for an OpenAI key to be able to execute the index.as_query_engine block
So I want to use Ollama instead of ChatGPT. I just copy and customize the code:

Plain Text
# Set Ollama

from llama_index.llms.ollama import Ollama
from llama_index.core import Settings

llm = Ollama(model="solar", request_timeout=60.0)
I can query the llm using:
Plain Text
# Testing Ollama 
response = llm.complete("What is the Maker Movement?")
print(response)
But when I reach the end of the notebook, it asks for the OpenAI key again:
Plain Text
from llama_index.core import Settings


Settings.llm = Ollama(model="solar", request_timeout=60.0)
Settings.embed_model = <embed_model>


This will set global defaults
Or you can override at the component level
Plain Text
index = VectorStoreIndex.from_documents(..., embed_model=embed_model)

query_engine = index.as_query_engine(llm=llm)
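Putting it together, a minimal fully local sketch (the HuggingFace embedding integration and the BAAI/bge-small-en-v1.5 model are assumptions here; any local embed model works). The embed model is the important part, since embeddings also default to OpenAI:

Plain Text
from llama_index.core import Settings, VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Both global defaults are local now, so nothing falls back to OpenAI
Settings.llm = Ollama(model="solar", request_timeout=60.0)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
print(query_engine.query("What is the Maker Movement?"))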
The "Using LLMs" tutorial specifies the Settings.llm call, yet the Ollama notebook uses the bare llm syntax, and the ChromaDB example omits both Settings.llm and llm
Because the Chroma example is just using gpt-3.5 (the default)

The ollama notebook is showing how to create the LLM object, which you can pass into settings as a global default or override at the component level
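For example, a sketch of the component-level route (same assumption as above about the local embed model; note that both overrides are needed, since index construction uses the embed model and querying uses the LLM):

Plain Text
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

llm = Ollama(model="solar", request_timeout=60.0)
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

documents = SimpleDirectoryReader("data").load_data()
# embed_model is used while building the index, llm while querying;
# leaving either unset falls back to the OpenAI defaults
index = VectorStoreIndex.from_documents(documents, embed_model=embed_model)
query_engine = index.as_query_engine(llm=llm)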
Ok, I get the point.
I would say it makes the examples a bit confusing. I tried setting llm in the code, but since I was not passing it to as_query_engine(llm=llm), it still asked for the OpenAI key
Would you consider a PR to update the docs and make the configuration option explicit?
We can't update every notebook example πŸ˜… I would rely on the core docs for understanding configuration

For example
https://docs.llamaindex.ai/en/stable/getting_started/customization.html
https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local.html

We are working on a larger docs refactor -- far less focus on notebook examples, more focus on actual API documentation. Probably out in the next 3 weeks or so
I will try to provide constructive feedback to it