yata
Offline, last seen 4 months ago
Joined September 25, 2024
Hi everyone, I am trying to update some old code that uses Ollama, since ServiceContext is deprecated, following these instructions: https://docs.llamaindex.ai/en/stable/module_guides/supporting_modules/service_context_migration.html
But what is the equivalent of embed_model for Ollama??



OLD
Python
from llama_index.llms.ollama import Ollama
from llama_index.core import ServiceContext
from llama_index.core.chat_engine import SimpleChatEngine

llm = Ollama(model="mistral")
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model="local:BAAI/bge-small-en-v1.5", 
)

chat_engine = SimpleChatEngine.from_defaults(service_context=service_context)
print(chat_engine.chat("Hi can you write a python script to use SimpleChatEngine with llama_index"))

NEW
Python
from llama_index.llms.ollama import Ollama
from llama_index.core import Settings
from llama_index.core.chat_engine import SimpleChatEngine

Settings.llm = Ollama(model="mistral")

# No service_context anymore — from_defaults() picks up Settings.llm automatically
chat_engine = SimpleChatEngine.from_defaults()
print(chat_engine.chat("Hi can you write a python script to use SimpleChatEngine with llama_index"))
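For the embedding part: in the Settings-based setup, the equivalent of the old embed_model argument is Settings.embed_model. A minimal sketch, assuming the llama-index-embeddings-huggingface (and optionally llama-index-embeddings-ollama) packages are installed and an Ollama server is running; the embedding model names are just examples:

```python
from llama_index.llms.ollama import Ollama
from llama_index.core import Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

Settings.llm = Ollama(model="mistral")

# Equivalent of the old embed_model="local:BAAI/bge-small-en-v1.5"
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Or, to serve embeddings through Ollama itself (model name is an example):
# from llama_index.embeddings.ollama import OllamaEmbedding
# Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")
```

Note that SimpleChatEngine never embeds anything, so Settings.embed_model only matters once you build an index (e.g. VectorStoreIndex) on top of this setup.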
14 comments