Hi, I see that LLMSingleSelector supports other LLMs, but when I use a custom LLM via `service_context`, I hit this issue:
```python
llm = OurLLM()
service_context = ServiceContext.from_defaults(
    llm=llm,
    text_splitter=text_splitter,
    embed_model=embed_model,
)

# initialize router query engine (single selection, pydantic)
query_engine = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(service_context=service_context),
    query_engine_tools=query_engine_tools,
)
```
This raises:

```
ValueError:
******
Could not load OpenAI model. If you intended to use OpenAI, please check your OPENAI_API_KEY.
Original error:
No API key found for OpenAI.
Please set either the OPENAI_API_KEY environment variable or openai.api_key prior to initialization.
API keys can be found or created at https://platform.openai.com/account/api-keys
To disable the LLM entirely, set llm=None.
******
```
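For what it's worth, here is a sketch of the workaround I am considering: pass the same custom `service_context` to the router itself, not only to the selector, so the router's internal components do not fall back to the default OpenAI LLM. I have not confirmed that `RouterQueryEngine` accepts a `service_context` keyword; this is an assumption based on the legacy `ServiceContext` API.

```python
# Sketch (unverified): give the router the same custom service_context,
# on the assumption that its defaults otherwise instantiate OpenAI.
query_engine = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(service_context=service_context),
    query_engine_tools=query_engine_tools,
    service_context=service_context,  # assumed kwarg, not confirmed
)
```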