Is FunctionTool.from_defaults() meant to be able to handle class methods? I tried to pass one to it and it hangs. For some reason I can't debug it either; no breakpoints inside it are hit.

WARNING:llama_index.core.chat_engine.types:Encountered exception writing response to history: Error code: 400 - {'error': {'message': '\'define_function(name: str, description: str, arguments: str, packages: str, code: str) -> str\\n\\n Define a function to add to the context of the conversation. Necessary Python packages must be declared.\\n Once defined, the assistant may decide to use this function, respond with a normal message.\\n\\n Parameters:\\n name (str): The name of the function to define. This will be used as the identifier for the function in the conversation context.\\n description (str): A short description of the function. This should clearly explain the purpose and functionality of the function.\\n arguments (str): JSON schema of arguments encoded as a string. This schema defines the expected arguments for the function.\\n Example format: \\\'{ "url": { "type": "string", "description": "The URL" }}\\\'. The schema should specify the type\\n and description for each argument.\\n\\n\\n Returns:\\n str: A message indicating that a function has been successfully added to the context of the conversation, along with its description.\\n \' is too long - \'tools.0.function.description\' (request id: 2024040908542331434894662876070)', 'type': 'invalid_request_error', 'param': '', 'code': None}}
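The 400 is OpenAI rejecting tools.0.function.description for length: when you wrap a callable without giving an explicit description, the tool's full docstring (signature and all) ends up as the description. A minimal, library-free sketch of the workaround, passing a short explicit description instead of relying on the docstring (the 1024-character limit used here is an assumption, not a documented constant):

```python
import inspect

# Assumed limit: OpenAI has been reported to reject function descriptions
# over roughly 1024 characters. Treat this value as a guess, not spec.
MAX_DESCRIPTION_LEN = 1024

def tool_description(fn, description=None, max_len=MAX_DESCRIPTION_LEN):
    """Mimic how a tool wrapper derives its description: use the explicit
    description when given, otherwise fall back to the full docstring."""
    desc = description if description is not None else (inspect.getdoc(fn) or "")
    if len(desc) > max_len:
        raise ValueError(
            f"description is {len(desc)} chars; API rejects more than {max_len}"
        )
    return desc

def define_function(name, description, arguments, packages, code):
    """Define a function to add to the context of the conversation.

    (Imagine the long multi-paragraph docstring from the error message
    here -- it is what lands in tools.0.function.description.)
    """

# Falling back to a verbose docstring can blow past the limit;
# an explicit one-liner stays well under it.
short = tool_description(
    define_function,
    description="Define a function the assistant can call later.",
)
print(len(short))
```

With llama-index specifically, the equivalent fix would be passing `description=` to `FunctionTool.from_defaults()` rather than letting it pick up the docstring.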
tool.metadata.to_openai_tool() is called for each tool:

tool_specs = [t.metadata.to_openai_tool() for t in tools]
response = client.chat.completions.create(
    messages=message_dicts,
    stream=False,
    tools=tool_specs,
    model=model,
    temperature=temperature,
    ....
)
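Before handing tool_specs to the API, you can lint them locally to catch the 400 early. A sketch assuming the standard OpenAI tool-spec shape ({"type": "function", "function": {"name", "description", "parameters"}}), which is what to_openai_tool() emits; the 1024-character limit is again an assumption:

```python
MAX_DESC = 1024  # assumed limit; adjust if the API documents a different one

def lint_tool_specs(tool_specs, max_len=MAX_DESC):
    """Return (name, length) pairs for every tool whose description exceeds max_len."""
    offenders = []
    for spec in tool_specs:
        fn = spec.get("function", {})
        desc = fn.get("description") or ""
        if len(desc) > max_len:
            offenders.append((fn.get("name"), len(desc)))
    return offenders

# Hypothetical specs standing in for [t.metadata.to_openai_tool() for t in tools]
specs = [
    {"type": "function",
     "function": {"name": "define_function",
                  "description": "x" * 2000,
                  "parameters": {"type": "object", "properties": {}}}},
    {"type": "function",
     "function": {"name": "ok_tool",
                  "description": "Short and fine.",
                  "parameters": {"type": "object", "properties": {}}}},
]
print(lint_tool_specs(specs))  # → [('define_function', 2000)]
```

Running this check right before client.chat.completions.create() pinpoints which tool tripped the limit without a round-trip to the API.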
The "Encountered exception writing response to history" message seems not to come from OpenAI itself. I never had an issue with that function/tool before using it in llama-index, and I don't think I added to its description's length since the last time I used it, but I'll run a more conclusive test directly against the API to be sure.