is it possible to pass Pydantic BaseModel as a tool input?

is it possible to pass Pydantic BaseModel as a tool input? it seems it is not supported as per the error
Plain Text
    raise RuntimeError(f'no validator found for {type_}, see `arbitrary_types_allowed` in Config')
RuntimeError: no validator found for <class 'external_data_loader.GithubRepoItem'>, see `arbitrary_types_allowed` in Config
16 comments
@kapa.ai how to pass Pydantic BaseModel as tool input?
Let me check the knowledge base for you. One moment! :books:
----------------------
To pass a Pydantic BaseModel as tool input in LlamaIndex, you can use the FunctionTool class from llama_index.core.tools. Here's a step-by-step guide:

  1. Define your Pydantic BaseModel. For example, a Song model:
Plain Text
from pydantic import BaseModel

class Song(BaseModel):
    """A song with name and artist"""
    name: str
    artist: str


  2. Define a function that takes the fields of your BaseModel as arguments and returns an instance of your BaseModel. For example, a generate_song function:
Plain Text
def generate_song(name: str, artist: str) -> Song:
    """Generates a song with provided name and artist."""
    return Song(name=name, artist=artist)


  3. Create a FunctionTool from your function using the from_defaults method:
Plain Text
from llama_index.core.tools import FunctionTool

tool = FunctionTool.from_defaults(fn=generate_song)


  4. Use the predict_and_call method of your LLM (either Upstage or OpenAI) to call the tool with your input:
Plain Text
from llama_index.llms.upstage import Upstage

llm = Upstage()
response = llm.predict_and_call([tool], "Generate a song")


In this example, "Generate a song" is the input to the LLM, which will be passed to the generate_song function. The function will return a Song object, which is an instance of the Pydantic BaseModel.
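The Song example from the steps above can be sanity-checked on its own, with no LLM or LlamaIndex involved. A minimal sketch (the song name and artist values are made up for illustration):

```python
# Exercise the Song model and generate_song function from the steps above
# directly, without an LLM, to confirm the function returns a validated
# Pydantic instance. (Song name/artist values are invented.)
from pydantic import BaseModel


class Song(BaseModel):
    """A song with name and artist"""
    name: str
    artist: str


def generate_song(name: str, artist: str) -> Song:
    """Generates a song with provided name and artist."""
    return Song(name=name, artist=artist)


song = generate_song(name="Chasing Cars", artist="Snow Patrol")
print(song.name, "-", song.artist)  # Chasing Cars - Snow Patrol
```

When an LLM calls the tool, the tool's inferred schema guides the argument extraction, but the function itself behaves exactly as above.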

You can find more details in the LlamaIndex documentation and GitHub examples.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
I don't see why not. Might need to see the full error though
This is the error:
Plain Text
    raise RuntimeError(f'no validator found for {type_}, see `arbitrary_types_allowed` in Config')
RuntimeError: no validator found for <class 'external_data_loader.GithubRepoItem'>, see `arbitrary_types_allowed` in Config
and function tool is specified as follows:
Plain Text
def chat(self, repoItem: GithubRepoItem, query: str) -> AgentResponse:
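For illustration, the error can be reproduced without LlamaIndex at all: it occurs whenever a pydantic v2 model class is used as a field type when building a pydantic.v1 model. A minimal sketch, assuming pydantic >= 2 is installed (so the pydantic.v1 compatibility layer exists) and using invented fields for GithubRepoItem:

```python
# Reproduce the "no validator found" error outside LlamaIndex: a pydantic
# v2 model class used as a field type in a pydantic.v1 create_model call.
# GithubRepoItem's fields are invented here for illustration.
from pydantic import BaseModel  # pydantic v2
from pydantic.v1 import create_model  # v1 compatibility layer


class GithubRepoItem(BaseModel):  # a v2 model, like the one in the error
    owner: str
    repo: str


err = None
try:
    # pydantic.v1 has no validator for a v2 model class, and
    # arbitrary_types_allowed defaults to False, so this raises
    create_model("ChatInput", repoItem=(GithubRepoItem, ...), query=(str, ...))
except RuntimeError as exc:
    err = exc

print(err)
```

This matches the traceback in the thread: the v1 layer does not recognize a v2 BaseModel subclass as a model, so it falls through to the arbitrary-type check and raises.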
@Logan M it seems that if I use create_model from Pydantic directly, it works. Somehow it seems related to the Pydantic version LlamaIndex uses for create_model, which doesn't support the model.
Looking at the code more, it seems the problem is with pydantic.v1, which errors out.
Plain Text
try:
    import pydantic.v1 as pydantic
    from pydantic.v1 import (
        BaseConfig,
        BaseModel,
        Field,
        PrivateAttr,
        StrictFloat,
        StrictInt,
        StrictStr,
        create_model,
        root_validator,
        validator,
        parse_obj_as,
    )
    from pydantic.v1.error_wrappers import ValidationError
    from pydantic.v1.fields import FieldInfo
    from pydantic.v1.generics import GenericModel
except ImportError:
    import pydantic  # type: ignore
    from pydantic import (
        BaseConfig,
        BaseModel,
        Field,
        PrivateAttr,
        StrictFloat,
        StrictInt,
        StrictStr,
        create_model,
        root_validator,
        validator,
        parse_obj_as,
    )
    from pydantic.error_wrappers import ValidationError
    from pydantic.fields import FieldInfo
    from pydantic.generics import GenericModel
I am not sure why pydantic.v1 gets priority over pydantic here.
OK, so if I use pydantic.v1 to create the BaseModel, it works. Seems like a compatibility issue. I think the recommendation here should be to use from llama_index.core.bridge.pydantic import BaseModel, Field whenever constructing tool inputs, instead of from pydantic import BaseModel
Pydantic.v1 gets priority because at least right now, we support both pydantic v1 and v2 (through the compatibility layer)
If we don't import from .v1, a lot of things break
Hoping to make a full migration soon. But for now, use the v1 layer
Thanks. So it seems that constructing a Pydantic BaseModel with v2 does not work with create_model from v1.
For now, as I said, I am using from llama_index.core.bridge.pydantic import BaseModel, Field when defining the tool inputs in my codebase
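The workaround above can be sketched as follows. Per the try/except shown earlier in the thread, the bridge module re-exports pydantic.v1 when pydantic >= 2 is installed, so pydantic.v1 is used directly here to keep the example runnable without LlamaIndex; the GithubRepoItem fields are invented for illustration:

```python
# Sketch of the workaround: define tool-input models with the same Pydantic
# version LlamaIndex's bridge resolves to (pydantic.v1 when pydantic >= 2
# is installed). GithubRepoItem's fields are invented for illustration.
from pydantic.v1 import BaseModel, Field, create_model


class GithubRepoItem(BaseModel):  # a v1 model, matching the bridge
    owner: str = Field(description="Repository owner")
    repo: str = Field(description="Repository name")


# create_model now accepts the v1 model as a field type without error
ChatInput = create_model(
    "ChatInput",
    repoItem=(GithubRepoItem, ...),
    query=(str, ...),
)

chat_input = ChatInput(
    repoItem=GithubRepoItem(owner="octocat", repo="hello-world"),
    query="What does this repo do?",
)
print(chat_input.repoItem.owner)  # octocat
```

In application code, importing from llama_index.core.bridge.pydantic instead of pydantic.v1 achieves the same thing while staying forward-compatible with LlamaIndex's planned migration to v2.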