Install the LlamaIndex HuggingFace embeddings integration:

```bash
# install the LlamaIndex HuggingFace embeddings lib
pip install llama-index-embeddings-huggingface
```

then load the model and set it (along with your LLM) as the global defaults:

```python
from llama_index.core import Settings

# now load the model as the global embedding model
Settings.embed_model = "local:BAAI/bge-base-en-v1.5"
Settings.llm = llm
```

and the error will go away.
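If you prefer not to rely on the `local:` string shorthand, the same model can also be set by constructing the embedding class directly — a minimal sketch, assuming the package above is installed:

```python
from llama_index.core import Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# explicitly construct the HuggingFace embedding model instead of using the "local:" string
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-base-en-v1.5")
```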
`Settings` will override the local / per-object (abstraction-level) definition, i.e. `DocumentSummaryIndex.from_documents(documents, llm=llm)`, right?
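For reference, a minimal sketch of the two places an LLM can be supplied that this question contrasts — the global `Settings.llm` default and the per-index `llm=` argument (assuming `llm` is an already-constructed LLM object and a `./data` folder with documents exists):

```python
from llama_index.core import Settings, SimpleDirectoryReader, DocumentSummaryIndex

documents = SimpleDirectoryReader("./data").load_data()

# global default: used by components that are not given an llm explicitly
Settings.llm = llm

# per-index argument: scoped to this DocumentSummaryIndex only
index = DocumentSummaryIndex.from_documents(documents, llm=llm)
```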