When converting a function into a tool

When converting a function into a tool via FunctionTool(), is there a way to include an extra prompt along with the tool? My tool returns a list of links like: ['<a href="localhost/article/98">The US is proposing ...</a>']. The problem is that when the response is synthesized, the LLM strips the HTML markup and returns: [The US is proposing ...](localhost/article/98)
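For context, the setup looks roughly like this (a sketch; the return data is the example above, and the import path assumes a recent LlamaIndex, older versions import FunctionTool from llama_index.tools):
Plain Text
from llama_index.core.tools import FunctionTool

def latest_news_headlines() -> list[str]:
  """Get the latest news headlines."""
  # Placeholder data; the real function fetches the links from a news source
  return ['<a href="localhost/article/98">The US is proposing ...</a>']

latest_news_tool = FunctionTool.from_defaults(fn=latest_news_headlines)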
12 comments
hmm, you could try adding extra description to the docstring of the function, or passing in the description manually 🤔

Or maybe you can also add to the system prompt of the agent?
ah you tried the description already. System prompt may help then

i.e. agent = OpenAIAgent.from_tools(..., system_prompt="my system prompt")
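Something like this, for example (just a sketch; the prompt wording and import path are assumptions, and latest_news_tool is the tool from the question):
Plain Text
from llama_index.agent.openai import OpenAIAgent  # older versions: from llama_index.agent import OpenAIAgent

agent = OpenAIAgent.from_tools(
  [latest_news_tool],
  system_prompt=(
    "When you use a tool, keep any HTML markup from the tool output "
    "verbatim in your answer."
  ),
)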
I thought about the system prompt, but I have multiple tools, so handling it on a per-tool basis would be best
will try to include the instruction as a comment in the function body
It could be generic too. Something like "When you use a tool, maintain the formatting of the tool output"
thanks, will try both. So I need to inject it somehow; there's no documented approach for it right now, from what I understand
o.O
Including it in the function body did the trick:
Plain Text
def latest_news_headlines():
  """Get the latest news headlines. Keep the HTML markup when returning the results."""
  ...
wondering why having the same instruction in the description didn't have the same effect
Plain Text
latest_news_tool = FunctionTool.from_defaults(
  fn=latest_news_headlines,
  name="latest_news",
  description="Get the latest news headlines. Keep the HTML markup when returning the results.",
)
Hmmm, that is a little suspect haha. I'm not sure either πŸ€” Something must be different internally between how the two are presented to the LLM
aham, will dive into the logs
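One quick way to compare what the LLM actually sees for each variant is to print the tool metadata, something like this (a sketch; assumes the function and tool from above):
Plain Text
# When no description is passed, FunctionTool builds one from the function
# name, signature and docstring; an explicit description is used as-is.
docstring_tool = FunctionTool.from_defaults(fn=latest_news_headlines)
explicit_tool = FunctionTool.from_defaults(
  fn=latest_news_headlines,
  name="latest_news",
  description="Get the latest news headlines. Keep the HTML markup when returning the results.",
)

# The description is what ends up in the function-calling schema sent to the LLM
print(docstring_tool.metadata.name, docstring_tool.metadata.description)
print(explicit_tool.metadata.name, explicit_tool.metadata.description)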