
Custom LLM

How did you set up the LLM? Just set a global service context with your LLM and it should work:

from llama_index import ServiceContext, set_global_service_context 

service_context = ServiceContext.from_defaults(llm=llm)
set_global_service_context(service_context)
Thank you, that fixed the issue.