
Hi, I see LLMSingleSelector support

Hi, I see that LLMSingleSelector supports other LLMs, but when I use a custom LLM via service_context, I run into this issue:
# imports added for context (paths as in pre-0.10 llama_index)
from llama_index import ServiceContext
from llama_index.query_engine import RouterQueryEngine
from llama_index.selectors.llm_selectors import LLMSingleSelector

llm = OurLLM()  # custom LLM class
service_context = ServiceContext.from_defaults(
    llm=llm, text_splitter=text_splitter, embed_model=embed_model
)
# initialize router query engine (single selection, pydantic)
query_engine = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(service_context=service_context),
    query_engine_tools=query_engine_tools,
)
ValueError: 
******
Could not load OpenAI model. If you intended to use OpenAI, please check your OPENAI_API_KEY.
Original error:
No API key found for OpenAI.
Please set either the OPENAI_API_KEY environment variable or openai.api_key prior to initialization.
API keys can be found or created at https://platform.openai.com/account/api-keys

To disable the LLM entirely, set llm=None.
******
2 comments
How's your embed_model defined? Is it using OpenAI?
You'll need to pass the service_context into RouterQueryEngine itself as well.
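A minimal sketch of that fix, assuming the legacy (pre-0.10) llama_index API in which RouterQueryEngine accepts a service_context keyword; OurLLM, text_splitter, embed_model, and query_engine_tools are the asker's own objects, not library names:

```python
# Sketch, not a verified implementation: assumes pre-0.10 llama_index,
# where RouterQueryEngine takes a service_context argument.
from llama_index import ServiceContext
from llama_index.query_engine import RouterQueryEngine
from llama_index.selectors.llm_selectors import LLMSingleSelector

llm = OurLLM()  # the asker's custom LLM class
service_context = ServiceContext.from_defaults(
    llm=llm,
    text_splitter=text_splitter,
    embed_model=embed_model,  # must also be non-OpenAI, or the same error recurs
)

query_engine = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(service_context=service_context),
    query_engine_tools=query_engine_tools,
    service_context=service_context,  # pass it here too; otherwise the router
                                      # falls back to the default OpenAI LLM
)
```

Without the extra service_context on RouterQueryEngine, the router builds its internal components (e.g. the response synthesizer) from the global defaults, which try to instantiate an OpenAI model and raise the "No API key found for OpenAI" error shown above.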