Hey y'all, trying to add my function

Hey y'all, trying to add my function parameters from my pydantic output cls to promptfoo (https://github.com/promptfoo/promptfoo/blob/main/examples/openai-function-call/promptfooconfig.yaml)

doesn't seem like they support tools yet, so my question is: what's the easiest way to get my pydantic output cls in the correct functions format to just paste it in? I tried openai_fn_spec = to_openai_tool(self._output_cls) but not sooooo helpful.
Unless I'm reading this wrong, you could swipe the to_openai_tool(...)['function'] output and use name, description, and parameters from there? Although it might need some light massaging to fit the promptfoo format
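For anyone landing here later, that massaging could look like the sketch below. This is a minimal sketch, assuming the promptfoo example config just wants bare name/description/parameters objects; the Contact model here is a stand-in, not from the thread:

Plain Text
from llama_index.llms.openai_utils import to_openai_tool
from pydantic import BaseModel

class Contact(BaseModel):
    """A single contact."""
    email: str

# to_openai_tool wraps the spec as {"type": "function", "function": {...}},
# so unwrap it and keep only the fields the promptfoo example seems to use
fn = to_openai_tool(Contact)["function"]
promptfoo_fn = {
    "name": fn["name"],
    "description": fn["description"],
    "parameters": fn["parameters"],
}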
openai_fn_spec = to_openai_tool(self._output_cls)
printing this somehow includes
Plain Text
"items": {
            "$ref": "#/definitions/Contact"
          }

which I thought was pydantic stuff?
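(For the record, it is pydantic stuff: pydantic v1's .schema() hoists nested models into a definitions table and points at them with $ref. A quick demo with a throwaway model pair:)

Plain Text
from typing import List
from pydantic import BaseModel

class Contact(BaseModel):
    email: str

class ContactList(BaseModel):
    contacts: List[Contact]

schema = ContactList.schema()
print(schema["properties"]["contacts"]["items"])
# {'$ref': '#/definitions/Contact'}
print(list(schema["definitions"]))
# ['Contact']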
true true, need some helper function to clean up/transform the output lol
does $ref work in tools in openai?
i can't imagine πŸ™‚
so I tried massaging it but just kind of dropping the Contact definition into items, no bueno haha
When we send tools to openai, we use

Plain Text
from llama_index.llms.openai_utils import to_openai_tool

to_openai_tool(pydantic_class)
yeah I printed that
it comes out with $ref ??!?
yep thats where I printed
above attachment is the result-ish
i'll keep working on this
Yea that output is exactly what we give the openai client it seems
wonder if what i'm printing is different from what gets sent?
don't think so
i tried just copying the definition into items
no bueno haha
it's like a nested model, which is why I think this is problematic
maybe I'll use llama_index's old to_openai_function call
probably the solution here, was going to suggest that next haha
Plain Text
from typing import Any, Dict, Type
from pydantic import BaseModel

def to_openai_function(pydantic_class: Type[BaseModel]) -> Dict[str, Any]:
    """Convert pydantic class to OpenAI function."""
    schema = pydantic_class.schema()
    return {
        "name": schema["title"],
        "description": schema["description"],
        "parameters": pydantic_class.schema(),
    }
it's the same code WTF
return {"type": "function", "function": function}
only difference
how is this working
it works for openai πŸ˜…
I think we just need a dedicated function to conform to whatever input promptfoo wants
well, I think they want just what openai takes
well, apparently not tho πŸ˜†
openai complains when I send in $ref
hold on let me retry.
Are you sure you used the proper function? Just tried this

Plain Text
>>> from llama_index.llms.openai_utils import to_openai_tool
>>> from pydantic import BaseModel
>>> class Test(BaseModel):
...     """This is a test"""
...     name: str = "Logan"
... 
>>> to_openai_tool(Test)
{'type': 'function', 'function': {'name': 'Test', 'description': 'This is a test', 'parameters': {'title': 'Test', 'description': 'This is a test', 'type': 'object', 'properties': {'name': {'title': 'Name', 'default': 'Logan', 'type': 'string'}}}}}
>>> 
... so weird, I'm literally doing print(to_openai_tool(self._output_cls))
let me try in the python repl
for reference I have pydantic 1.10.12 at the moment πŸ€”
is there an easy way to tell the pydantic version?
pip show pydantic
Plain Text
╰─ $ pip show pydantic
Name: pydantic
Version: 1.10.12
weird right?
@Logan M it's when it's a nested model
try to make your TestModel include another one
Plain Text
from typing import List, Optional
from pydantic import BaseModel, Field

class Contact(BaseModel):
    # not shown in the thread; fields implied by the docstring below
    first_name: str
    last_name: str
    email: str
    phone: str

class ContactList(BaseModel):
    """
    Provides a list of contacts.
    Return empty list [] if not confident. It is important to not hallucinate any of the data: email, phone, first name, last name.
    """
    contacts: Optional[List[Contact]] = Field(
        description="A list of contact information", default=None
    )
hmm ok one sec
Hmm yea I see it then, it has the #ref -- let me try sending this to openai lol
lol it works
thought that'd be the case
Plain Text
>>> to_openai_tool(Test2)
{'type': 'function', 'function': {'name': 'Test2', 'description': 'Nested thing', 'parameters': {'title': 'Test2', 'description': 'Nested thing', 'type': 'object', 'properties': {'prev': {'title': 'Prev', 'type': 'array', 'items': {'$ref': '#/definitions/Test'}}, 'name_2': {'title': 'Name 2', 'default': 'bmax', 'type': 'string'}}, 'definitions': {'Test': {'title': 'Test', 'description': 'This is a test', 'type': 'object', 'properties': {'name': {'title': 'Name', 'default': 'Logan', 'type': 'string'}}}}}}}
>>> from llama_index.llms import OpenAI
>>> llm = OpenAI()
>>> openai_fn_spec = to_openai_tool(Test2)
>>> from llama_index.llms import OpenAI, ChatMessage
>>> chat_response = llm.chat(messages=[ChatMessage(content="Hello!", role="user")], tools=[openai_fn_spec])
>>> 
yeah obviously has to work haha
on to debug promptfoo then
just a sanity check haha
i guess JSON Schema $ref is a thing?
which i didn't even know...
yea idek either ha
k the ref was failing
just replaced the ref
I GOT IT!!
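For anyone who hits this later: "replaced the ref" here means inlining each #/definitions/... entry at the spot where its $ref appears, since promptfoo evidently chokes on refs even though OpenAI accepts them. A rough sketch under those assumptions (inline_refs is a hypothetical helper, not from llama_index or promptfoo; it only handles local, non-cyclic refs, and reuses the ContactList model from earlier in the thread):

Plain Text
from typing import Any, Dict
from llama_index.llms.openai_utils import to_openai_tool

def inline_refs(schema: Dict[str, Any]) -> Dict[str, Any]:
    """Replace every {"$ref": "#/definitions/X"} with the body of
    definitions["X"], then drop the now-unused "definitions" key."""
    definitions = schema.get("definitions", {})

    def resolve(node: Any) -> Any:
        if isinstance(node, dict):
            if "$ref" in node:
                name = node["$ref"].split("/")[-1]
                return resolve(definitions[name])
            return {k: resolve(v) for k, v in node.items() if k != "definitions"}
        if isinstance(node, list):
            return [resolve(v) for v in node]
        return node

    return resolve(schema)

fn = to_openai_tool(ContactList)["function"]
fn["parameters"] = inline_refs(fn["parameters"])  # flat spec, safe to paste into promptfoo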