

Pydantic validation error when attempting to subclass ChatResponse from llama_index.core.llms and extend it with a custom pydantic model.

The error is thrown when I add a list of MyModels to ChatResponse via subclassing, like so:

Plain Text
from pydantic import BaseModel
from llama_index.core.llms import ChatResponse
class MyModel(BaseModel):
  foo: str

class CompletionWithAttributions(ChatResponse):
  attributions: list[MyModel] | None = None


The error is not thrown if I don't subclass from ChatResponse, like so:
Plain Text
from pydantic import BaseModel
class MyModel(BaseModel):
  foo: str

class CompletionWithAttributions(BaseModel):
  attributions: list[MyModel] | None = None


I can resolve this error by importing BaseModel from llama-index itself like this, but that seems VERY kludgy, as elsewhere in my app I am importing BaseModel from pydantic as usual.

Plain Text
from llama_index.core.bridge.pydantic import BaseModel



Has anyone experienced anything like this with llama-index? I suspect it's due to the library's internal handling of pydantic versions.
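For what it's worth, the mismatch can be reproduced with pydantic alone, no llama-index needed. This is a sketch under the assumption that ChatResponse in 0.10.x is a pydantic v1 model (here ParentV1 is a hypothetical stand-in for it): a v1 parent can't build a validator for a field typed with a v2 model.

```python
# Minimal sketch (assumes pydantic>=2 installed, which bundles the
# pydantic.v1 compatibility layer). ParentV1 is a hypothetical stand-in
# for a v1-bridged model like ChatResponse in llama-index 0.10.x.
from typing import List, Optional

from pydantic import BaseModel as BaseModelV2      # pydantic v2
from pydantic.v1 import BaseModel as BaseModelV1   # bundled v1 layer

class MyModel(BaseModelV2):
    foo: str

class ParentV1(BaseModelV1):
    text: str

err = None
try:
    # pydantic v1 cannot find a validator for a pydantic v2 model type,
    # so this raises at class-definition time
    class Child(ParentV1):
        attributions: Optional[List[MyModel]] = None
except Exception as exc:
    err = exc

print(type(err).__name__ if err else "no error")
```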

I am using
llama-index==0.10.51
pydantic==2.6.4
3 comments
llama-index 0.10.x uses pydantic.v1
so your subclass should also import from there

from pydantic.v1 import BaseModel
heads up though, v0.11.x updates fully to pydantic v2
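To illustrate the suggested fix without pulling in llama-index itself, here is a pure-pydantic sketch (ParentV1 is a hypothetical stand-in for the v1-bridged ChatResponse): once the nested model is also defined with pydantic.v1, both classes come from the same pydantic generation and the subclass builds fine.

```python
# Sketch of the fix, assuming pydantic>=2 (which ships pydantic.v1).
# ParentV1 stands in for a v1-bridged model such as ChatResponse.
from typing import List, Optional

from pydantic.v1 import BaseModel  # same generation as the parent

class MyModel(BaseModel):
    foo: str

class ParentV1(BaseModel):
    text: str

class Child(ParentV1):
    attributions: Optional[List[MyModel]] = None

# Instantiation and field validation now work as expected
c = Child(text="hi", attributions=[MyModel(foo="bar")])
print(c.attributions[0].foo)
```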