At a glance

The community member is having an issue where a RouterQueryEngine fails, while the same code works in another case. The community members compare the service_context initialization between the two cases, specifically the llm parameter. It turns out that PydanticSingleSelector requires the llm as a keyword argument and does not pick it up from the global service_context. The community members note that this is a known issue and that the upcoming version 0.10 of the library will remove the service_context to address it.
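In short, the fix discussed in the thread below is to pass the LLM from the service context directly to the selector:

Plain Text
# pass the llm explicitly; the selector does not read the global service_context
selector = PydanticSingleSelector.from_defaults(llm=service_context.llm)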

While in the second case, where I am trying to use RouterQueryEngine, it fails. Below is the code for it.

Plain Text
# imports (pre-0.10 llama_index package layout)
from llama_index import ListIndex, VectorStoreIndex
from llama_index.query_engine import RouterQueryEngine
from llama_index.selectors.pydantic_selectors import PydanticSingleSelector
from llama_index.tools import QueryEngineTool

# construct list_index and vector_index from the parsed nodes and service_context
list_index = ListIndex(nodes, service_context=service_context)
vector_index = VectorStoreIndex(nodes, service_context=service_context)

# define list_query_engine and vector_query_engine
list_query_engine = list_index.as_query_engine(
    response_mode="tree_summarize",
    use_async=True,
)
vector_query_engine = vector_index.as_query_engine()

list_tool = QueryEngineTool.from_defaults(
    query_engine=list_query_engine,
    description="Useful for summarization questions related to the data source",
)
vector_tool = QueryEngineTool.from_defaults(
    query_engine=vector_query_engine,
    description="Useful for retrieving specific context related to the data source",
)

# construct RouterQueryEngine
query_engine = RouterQueryEngine(
    # selector=LLMSingleSelector.from_defaults(),
    selector=PydanticSingleSelector.from_defaults(),
    query_engine_tools=[
        list_tool,
        vector_tool,
    ],
)
response_str = query_engine.query(
    "What is the maximum quantity that can be submitted for a First Leg Order in trading?"
).response

And the HTTP log shows 404 Resource Not Found:
Plain Text
HTTP Request: POST https://visdam-labs.openai.azure.com/chat/completions "HTTP/1.1 404 Resource Not Found"


Please let me know if I am doing something wrong here. Thank you for looking into it.
Your service context is the same for both?
Plain Text
query_engine = RouterQueryEngine(
    # selector=LLMSingleSelector.from_defaults(),
    selector=PydanticSingleSelector.from_defaults(llm=service_context.llm),
    query_engine_tools=[
        list_tool,
        vector_tool,
    ],
    service_context=service_context
)
That should probably help.
For the first (successful) case:
Plain Text
# imports (pre-0.10 paths)
from llama_index import ServiceContext, set_global_service_context

# Initialize Service Context
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model=embed_model,
    chunk_size=512,
    chunk_overlap=20,
    callback_manager=callback_manager,
)
set_global_service_context(service_context)


While for the second (failing) case:
Plain Text
# Initialize Service Context
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model=embed_model,
    chunk_size=512,
    chunk_overlap=20,
)
set_global_service_context(service_context)
Sorry, I guess I should ask more specifically: is the llm the same?
Yea, since the selector needs an LLM as a kwarg, it's not picking it up from the global service context you set
(Super annoying I know, v0.10 is going to remove the service context I think LOL)
Yes! This helped. πŸ‘πŸ» Shouldn't the PydanticSingleSelector.from_defaults() method pick up the llm from the global service_context by default?
It's a long story, but no πŸ™‚
Once we remove the service context, it should make a lot more sense
It's a weird limbo state at the moment.
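Roughly, the plan is a single global Settings object instead of the service context -- this is only a sketch of the planned pattern, so exact names may change before release:
Plain Text
# sketch of the planned 0.10+ API: configure once, globally,
# instead of threading a service_context everywhere
from llama_index.core import Settings

Settings.llm = llm
Settings.embed_model = embed_model
Settings.chunk_size = 512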
Understood. And thanks again. It would be great if you could share a document, if there is one.
I don't think I have a document on this -- but just looking at the kwargs, the pydantic selector doesn't take in a service context (which implies it doesn't interact with it), just an LLM as a kwarg.

If you ever see llm as a kwarg to an interface, you can assume it does not interact with the service context
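As a sketch of the two call patterns (pre-0.10 API, so treat it as illustrative):
Plain Text
# PydanticSingleSelector takes the LLM directly and ignores the
# global service context, so it must be passed explicitly:
selector = PydanticSingleSelector.from_defaults(llm=service_context.llm)

# LLMSingleSelector accepts a service_context (falling back to the
# global one), so it works without an explicit llm:
selector = LLMSingleSelector.from_defaults()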
No worries. Thank you so much for the explanation.
Just to add: if I use LLMSingleSelector, I don't need to pass the LLM object to the function. It works perfectly fine.
That makes sense