Classmethods

Is FunctionTool.from_defaults() meant to be able to handle classmethods? I tried to pass one to it and it hangs. For some reason I can't debug it either; no breakpoints inside it get hit.
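As a sanity check independent of llama-index: a classmethod accessed on its class is already a bound method, so the introspection a tool factory like FunctionTool.from_defaults() typically relies on (name, docstring, signature) works on it the same as on a plain function. A minimal sketch (the Calculator class is a made-up example, not from the original question):

```python
import inspect

class Calculator:
    @classmethod
    def add(cls, a: int, b: int) -> int:
        """Add two integers."""
        return a + b

# Accessing the classmethod on the class yields a bound method whose
# signature no longer includes `cls`, so name/docstring/signature
# introspection behaves like it does for a free function.
fn = Calculator.add
print(fn.__name__)                             # add
print(list(inspect.signature(fn).parameters))  # ['a', 'b']
print(fn(2, 3))                                # 5
```

If this kind of introspection succeeds on your classmethod, the hang is unlikely to be about classmethods per se, which matches the misdiagnosis noted later in the thread.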
Hmm tbh, I'm not sure, never tried.

I would think it works though? If you interrupt when it hangs, what's the traceback like? Should be able to see what got stuck
it will probably just show me some asyncio crap. I'll have to make a separate example without async
am still working to get to the bottom of it. will update here in a bit
It seems to be that llama-index doesn't like the tool I'm trying to give it. Is there some limit to the size of the docstring?
Plain Text
WARNING:llama_index.core.chat_engine.types:Encountered exception writing response to history: Error code: 400 - {'error': {'message': '\'define_function(name: str, description: str, arguments: str, packages: str, code: str) -> str\\n\\n        Define a function to add to the context of the conversation. Necessary Python packages must be declared.\\n        Once defined, the assistant may decide to use this function, respond with a normal message.\\n\\n        Parameters:\\n        name (str): The name of the function to define. This will be used as the identifier for the function in the conversation context.\\n        description (str): A short description of the function. This should clearly explain the purpose and functionality of the function.\\n        arguments (str): JSON schema of arguments encoded as a string. This schema defines the expected arguments for the function.\\n                         Example format: \\\'{ "url": { "type": "string", "description": "The URL" }}\\\'. The schema should specify the type\\n                         and description for each argument.\\n\\n\\n        Returns:\\n        str: A message indicating that a function has been successfully added to the context of the conversation, along with its description.\\n        \' is too long - \'tools.0.function.description\' (request id: 2024040908542331434894662876070)', 'type': 'invalid_request_error', 'param': '', 'code': None}}
There is almost certainly something wrong with function calling/tools
It's hard to see, but the description is too long
is too long - \'tools.0.function.description\' at the end there
The OpenAI API doesn't have this limitation
I got around it by moving a bunch of it to a system prompt. obviously not ideal
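The shrink-the-description workaround above can be sketched roughly like this. The character limit and helper name here are assumptions for illustration, not documented OpenAI values or llama-index API:

```python
# Assumed cap on a tool description; OpenAI does not publish an exact
# number, so treat this as a conservative guess.
MAX_DESCRIPTION_CHARS = 1024

def truncate_description(docstring: str, limit: int = MAX_DESCRIPTION_CHARS) -> str:
    """Collapse whitespace in a docstring and cap its length, so the
    resulting tool description stays under an assumed API limit."""
    text = " ".join(docstring.split())
    if len(text) <= limit:
        return text
    return text[: limit - 3] + "..."

long_doc = "define_function adds a function to the conversation context. " * 50
print(len(truncate_description(long_doc)) <= MAX_DESCRIPTION_CHARS)  # True
```

The detailed parameter documentation that gets cut can then live in the system prompt instead, as described above; the tool description only needs enough for the model to pick the right tool.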
I mean, it seems like it does have this limitation, because this error is coming directly from the API?
It could also maybe be because the entire input to the LLM is too long, and it just decided to flag the description for some reason
We get the tools by using tool.metadata.to_openai_tool() for each tool

Then, this gets passed into the openai api client

Plain Text
tool_specs = [t.metadata.to_openai_tool() for t in tools]

response = client.chat.completions.create(
    messages=message_dicts,
    stream=False,
    tools=tool_specs,
    model=model,
    temperature=temperature,
    ....
)
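For debugging, it can help to check the specs client-side before the request goes out. The dict below is an assumed illustration of the Chat Completions tool-spec shape that each to_openai_tool() call produces; the function name and the 1024-character threshold are hypothetical:

```python
# Illustrative tool spec in the public Chat Completions "tools" format;
# not copied from llama-index output.
tool_spec = {
    "type": "function",
    "function": {
        "name": "define_function",
        "description": "Define a function to add to the conversation context.",
        "parameters": {
            "type": "object",
            "properties": {
                "name": {"type": "string", "description": "The function name"},
            },
            "required": ["name"],
        },
    },
}

def oversized(specs, limit=1024):
    """Return the names of tools whose description exceeds an assumed
    length limit, so the 400 can be caught before calling the API."""
    return [s["function"]["name"] for s in specs
            if len(s["function"]["description"]) > limit]

print(oversized([tool_spec]))  # []
```

Running this over tool_specs right before client.chat.completions.create() would show whether the flagged description really is the long one.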
I don't think it's that. "Encountered exception writing response to history" seems not to come from OpenAI. I never had an issue with that function/tool before using it in llama-index. I don't think I've added to its length since the last time I used it, but I'll run a more conclusive test against the API to be sure.

The issue about classmethods was a misdiagnosis.
It encountered an error writing the response, which seems to be the OpenAI API error you pasted?
Which seems to stem from a tool description
If you did the same setup using agent.chat(), I would expect a similar api error