Timeout

Global settings

from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

Settings.llm = Ollama(model="mistral", request_timeout=60.0)

Local settings

query_engine = index.as_query_engine(llm=Ollama(model="mistral", request_timeout=60.0))
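
For context, a minimal end-to-end sketch tying the two together. It assumes a local Ollama server with the mistral and nomic-embed-text models pulled, the llama-index-embeddings-ollama package installed, and a ./data folder of documents; the folder path, model names, and the 120-second override are placeholders, not anything from the thread above.

from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama

# Global defaults: every component that needs an LLM gets this 60s timeout
Settings.llm = Ollama(model="mistral", request_timeout=60.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Local override: this query engine uses a longer timeout, which takes
# precedence over Settings.llm for its calls only
query_engine = index.as_query_engine(
    llm=Ollama(model="mistral", request_timeout=120.0)
)
print(query_engine.query("Summarize the documents."))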

Settings

In the code below, why do we need to apply a SentenceSplitter transformation when we have already set Settings.chunk_size to some value? The code is taken from the official documentation.

# Global settings
from llama_index.core import Settings

Settings.chunk_size = 512

# Local settings

from llama_index.core.node_parser import SentenceSplitter

index = VectorStoreIndex.from_documents(
    documents, transformations=[SentenceSplitter(chunk_size=512)]
)
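
As I understand the Settings mechanism, the two blocks are alternatives rather than a required pair: Settings.chunk_size configures the default node parser used when from_documents receives no transformations, while an explicit SentenceSplitter(chunk_size=512) overrides that default for the one index. A rough sketch of the equivalence (the sample text below is just a placeholder):

from llama_index.core import Document, Settings
from llama_index.core.node_parser import SentenceSplitter

doc = Document(text="A long document that will be split into chunks. " * 100)

# Global default: chunk_size feeds the default node parser held in Settings
Settings.chunk_size = 512
nodes_from_global = Settings.node_parser.get_nodes_from_documents([doc])

# Local override: an explicit SentenceSplitter does the same splitting for
# one index and takes precedence over the global default
splitter = SentenceSplitter(chunk_size=512)
nodes_from_local = splitter.get_nodes_from_documents([doc])

print(len(nodes_from_global), len(nodes_from_local))  # same chunking either way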