Pydantic validation error when attempting to subclass `ChatResponse` from `llama_index.core.llms` and extend it with a custom pydantic model.

The error is thrown when I add a list of `MyModel`s to `ChatResponse` via subclassing, like so:
```python
from pydantic import BaseModel
from llama_index.core.llms import ChatResponse


class MyModel(BaseModel):
    foo: str


# defining this subclass raises the pydantic validation error
class CompletionWithAttributions(ChatResponse):
    attributions: list[MyModel] | None = None
```
The error is not thrown if I don't subclass from `ChatResponse`, like so:
```python
from pydantic import BaseModel


class MyModel(BaseModel):
    foo: str


class CompletionWithAttributions(BaseModel):
    attributions: list[MyModel] | None = None
```
I can resolve this error by importing `BaseModel` from llama-index itself, like this, but that seems very kludgy, since elsewhere in my app I am importing `BaseModel` from pydantic as usual:
```python
from llama_index.core.bridge.pydantic import BaseModel
```
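For completeness, this is the shape of the workaround as I have it, with both models built on llama-index's re-exported `BaseModel` (a minimal sketch of what works for me; same class and field names as in the examples above):

```python
from llama_index.core.bridge.pydantic import BaseModel
from llama_index.core.llms import ChatResponse


# MyModel is now the same flavor of BaseModel that ChatResponse
# uses internally, so the subclass definition no longer errors
class MyModel(BaseModel):
    foo: str


class CompletionWithAttributions(ChatResponse):
    attributions: list[MyModel] | None = None
```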
Has anyone experienced anything like this with llama-index? I suspect it's due to the library's internal handling of pydantic versions.
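One quick way to test that suspicion (a hedged sketch; it assumes nothing beyond the two imports already used above):

```python
from pydantic import BaseModel
from llama_index.core.llms import ChatResponse

# If ChatResponse were a regular pydantic v2 model, this would print True.
# If it prints False, llama-index is building its models on a different
# BaseModel class (presumably whatever its bridge module selects), which
# would explain why fields typed with the plain pydantic BaseModel fail.
print(issubclass(ChatResponse, BaseModel))
```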
I am using:

```
llama-index==0.10.51
pydantic==2.6.4
```