
Best alternative for deprecated query engine

Hey guys, I see this is deprecated. What's the best alternative now?

https://docs.llamaindex.ai/en/stable/module_guides/querying/structured_outputs/query_engine/
Plain Text
# Attach the Pydantic output class to the LLM
sllm = llm.as_structured_llm(output_cls)

# Pass the structured LLM to the query engine
query_engine = index.as_query_engine(llm=sllm)


Basically, attaching an output class to the LLM will (usually) force it to use that class as the output.
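For context, here is a minimal end-to-end sketch of that pattern. It assumes an OpenAI LLM and a local ./data directory; the Biography schema is hypothetical:
Plain Text
from pydantic import BaseModel, Field
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.openai import OpenAI

class Biography(BaseModel):
    """Hypothetical output schema for structured responses."""
    name: str = Field(description="Name of the person")
    summary: str = Field(description="One-sentence summary")

# Build an index over local documents
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Wrap the LLM so responses are parsed into Biography instances
llm = OpenAI(model="gpt-4o-mini")
sllm = llm.as_structured_llm(Biography)

query_engine = index.as_query_engine(llm=sllm)
response = query_engine.query("Who is this document about?")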
Aaahh okay cool
It seems like this one is incompatible with Pydantic v1 models?
Plain Text
pydantic_core._pydantic_core.ValidationError: 1 validation error for StructuredLLM
output_cls
  Input should be a subclass of BaseModel [type=is_subclass_of, input_value=<class 'models.MyCustomModel'>, input_type=ModelMetaclass]


Plain Text
# Model defined via the Pydantic v1 compatibility shim;
# this is what triggers the ValidationError above
from pydantic.v1 import BaseModel as BaseModelV1
from pydantic.v1 import Field as FieldV1

class MyCustomModel(BaseModelV1):
    test: str = FieldV1(..., description="Test")
Oh, we moved off of pydantic.v1 in v0.11.x
It was a huge lift 😅
Use normal Pydantic imports in the latest versions
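For example, the model above rewritten with standard Pydantic (v2) imports, which as_structured_llm accepts:
Plain Text
from pydantic import BaseModel, Field

class MyCustomModel(BaseModel):
    test: str = Field(..., description="Test")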
I remember my colleague created a ticket for the issue with Pydantic. Happy we can finally move off of v1.
Is OpenAIPydanticProgram still the way to do this if you don't want to query an index?
(The documentation seems to suggest so?)
That's one way to do it, yeah.

as_structured_llm is probably what I would do in 99% of cases though (in this case, it's probably calling OpenAIPydanticProgram under the hood).
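A minimal sketch of using as_structured_llm without an index, following the documented pattern (model name and prompt are placeholders):
Plain Text
from pydantic import BaseModel, Field
from llama_index.core.llms import ChatMessage
from llama_index.llms.openai import OpenAI

class MyCustomModel(BaseModel):
    test: str = Field(..., description="Test")

llm = OpenAI(model="gpt-4o-mini")
sllm = llm.as_structured_llm(MyCustomModel)

# The structured LLM parses the completion into the output class;
# the parsed object is available on response.raw
response = sllm.chat([ChatMessage.from_str("Fill out the test field")])
obj = response.raw  # instance of MyCustomModel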