Is set_global_service_context still required?

Is set_global_service_context still required? When I tried swapping to a local LLM, it didn't like the call with a service context on the vector store (it wanted Settings.embed_model), but now, after swapping back to AzureOpenAI, it requires the service context. I'm not sure I understand why there is a difference.
No, it is not required if you are on v0.10.x. You can define the llm and embed_model, add them to Settings, and it should be fine.
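
A minimal sketch of that pattern (the split package names and the "data" path are assumptions for illustration, adjust to your setup):

from llama_index.core import Settings, VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# register global defaults once; no ServiceContext needed on v0.10
Settings.llm = AzureOpenAI(
    model="gpt-4",
    deployment_name="gpt-4",
    api_key="<azure-key>",
    azure_endpoint="<azure-endpoint>",
    api_version="<api-version>",
)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)  # picks up Settings.llm / embed_model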
I tried this, but I keep getting an error about Azure OpenAI not being set up/authenticated when I don't pass the service_context part of VectorStoreIndex.from_documents(service_context=service_context) and instead use Settings.llm and Settings.embed_model directly. I'm using 0.10.11.
Did you upgrade from 0.9 to 0.10 with pip install --upgrade?
Can you give a complete code sample? It's been working for other people just fine 👀
I can't give the full example now because I uninstalled the entire LlamaIndex env, reinstalled it, and it just worked... but basically, as you said, I upgraded from 0.9 to 0.10 and had to change all of the deprecated stuff. Then I had:

from llama_index.core import Settings
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# api_key, azure_endpoint and api_version are defined elsewhere
Settings.llm = AzureOpenAI(
    model="gpt-4",
    deployment_name="gpt-4",
    api_key=api_key,
    azure_endpoint=azure_endpoint,
    api_version=api_version,
)
Settings.embed_model = HuggingFaceEmbedding(
    model_name="BAAI/bge-small-en-v1.5"
)

This didn't work when using the vector store like this:

index = VectorStoreIndex.from_documents(documents)

It said there were no credentials unless I had this somewhere:

service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model="local:BAAI/bge-small-en-v1.5",
    chunk_size=1024,
)
set_global_service_context(service_context)

But after completely removing everything again and reinstalling with pip, it worked without the service_context... so I don't know what to say.
Thanks for trying to help anyway. I'm just assuming it was another pip issue, for like the 3rd time in the last month.
If there is a mix-up of legacy and new code, plus the old llama-index is not properly removed, then you are going to face certain issues.

A new env is recommended.
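
For reference, a rough sketch of the 0.9 vs 0.10 split that typically causes that mix-up (the pip package name in the comment is the usual one, adjust to your integrations):

# legacy v0.9 style -- conflicts if an old llama-index install is still around:
# from llama_index import ServiceContext, set_global_service_context

# v0.10 style -- core lives in llama_index.core, integrations in split packages
# (e.g. pip install llama-index-embeddings-huggingface):
from llama_index.core import Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# ServiceContext.from_defaults(llm=..., embed_model=..., chunk_size=1024) plus
# set_global_service_context(...) collapses into module-level Settings:
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
Settings.chunk_size = 1024  # replaces the chunk_size kwarg on ServiceContext
# Settings.llm is set the same way with your AzureOpenAI instance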