Hi everyone, I am trying to update some old code that uses Ollama, since ServiceContext is deprecated, following these instructions:
https://docs.llamaindex.ai/en/stable/module_guides/supporting_modules/service_context_migration.html
But what is the equivalent of the embed_model setting for Ollama?
OLD
from llama_index.llms.ollama import Ollama
from llama_index.core import ServiceContext
from llama_index.core.chat_engine import SimpleChatEngine
llm = Ollama(model="mistral")
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model="local:BAAI/bge-small-en-v1.5",
)
chat_engine = SimpleChatEngine.from_defaults(service_context=service_context)
print(chat_engine.chat("Hi can you write a python script to use SimpleChatEngine with llama_index"))
NEW
from llama_index.llms.ollama import Ollama
from llama_index.core import Settings
from llama_index.core.chat_engine import SimpleChatEngine
Settings.llm = Ollama(model="mistral")
chat_engine = SimpleChatEngine.from_defaults()  # picks up Settings.llm; old line still referenced the removed service_context
print(chat_engine.chat("Hi can you write a python script to use SimpleChatEngine with llama_index"))
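For reference, here is a minimal sketch of what I think the full migration might look like, based on the Settings pattern from the migration guide. The HuggingFaceEmbedding class and the llama-index-embeddings-huggingface package are assumptions on my part as the replacement for the old "local:..." embed_model string; I have not verified this against the latest release:

```python
from llama_index.llms.ollama import Ollama
from llama_index.core import Settings
from llama_index.core.chat_engine import SimpleChatEngine
# Assumption: requires `pip install llama-index-embeddings-huggingface`
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Global Settings replace the old per-object ServiceContext
Settings.llm = Ollama(model="mistral")
# Assumed equivalent of embed_model="local:BAAI/bge-small-en-v1.5"
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# SimpleChatEngine now reads the LLM from Settings; no service_context argument
chat_engine = SimpleChatEngine.from_defaults()
print(chat_engine.chat("Hi can you write a python script to use SimpleChatEngine with llama_index"))
```

Note that SimpleChatEngine itself seems to only need the LLM; as far as I can tell, the embed_model would only matter once an index is built, so the Settings.embed_model line may be unnecessary for this particular snippet.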