
Updated 2 months ago

Hey, I was trying a project on a different computer, and now I can't use Query Engine with Pydantic Outputs
Here's my code:
Plain Text
class List(BaseModel):
    """
    List of items.
    """
    items: list[str]

vector_store = RedisVectorStore(
    custom_schema={},  # some complex schema I skip
    redis_url=f"redis://{REDIS_HOST}:{REDIS_PORT}",
)

llm = OpenAI(model="gpt-3.5-turbo-0125", temperature=0.0, api_key=OPENAI_API_KEY)
embeddings_model = OpenAIEmbedding(api_key=OPENAI_API_KEY)

Settings.embed_model = embeddings_model
Settings.llm = llm

index = VectorStoreIndex.from_vector_store(vector_store=vector_store)
pydantic_query_engine = index.as_query_engine(output_cls=List, response_mode=ResponseMode.TREE_SUMMARIZE)
query: PydanticResponse = pydantic_query_engine.query("List 5 keywords describing the text.")
print(query.response.json())
It works perfectly well with classic queries πŸ€”
12 comments
Is there an error?
what version of things do you have? It seems to work fine for me on latest
How did you import BaseModel?
This example is working for me

Plain Text
from llama_index.core import VectorStoreIndex, Document, MockEmbedding

index = VectorStoreIndex.from_documents([Document.example()], embed_model=MockEmbedding(embed_dim=256))

from pydantic.v1 import BaseModel

class Fact(BaseModel):
    """A fact about the provided text."""
    fact: str

query_engine = index.as_query_engine(output_cls=Fact, llm=llm)  # llm defined as in your snippet

response = query_engine.query("What is this document about?")
print(response.fact)
AH, it only works with pydantic v1 ??
I was using pydantic 2
It should be mentioned in the documentation!
I think all the docs use from llama_index.core.bridge.pydantic import BaseModel, ... when pydantic is used πŸ˜…
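To make the workaround concrete: a minimal sketch of a v1-compatible model definition, assuming pydantic 2.x is installed (it ships its v1 API under the pydantic.v1 namespace). The try/except fallback and the Keywords class are illustrative, not from the thread:

```python
from typing import List

try:
    # The bridge module the docs rely on, which resolves to the pydantic
    # version llama_index was built against.
    from llama_index.core.bridge.pydantic import BaseModel
except ImportError:
    # Fallback if llama_index isn't available: the v1 API shipped inside pydantic 2.x.
    from pydantic.v1 import BaseModel

class Keywords(BaseModel):
    """Keywords describing a text."""
    items: List[str]

# Models defined this way can be passed as output_cls to as_query_engine().
print(Keywords(items=["redis", "llama-index"]).json())
```

Using typing.List (rather than the built-in list[str]) and a class name that doesn't shadow List keeps the model compatible with the v1 API.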
Moving to pydantic v2 in the next few weeks!
Ah I see, you should still mention it in the docs :')