Find answers from the community

Updated 7 months ago

Workflow

At a glance

The community member created an agent that makes an external API call and retrieves JSON data of contact persons. They are having issues when passing this API output back to the language model (LLM). A comment suggests that the community member can provide a formatted response to the LLM, such as user_details="""user_name: {name}""", and then have the LLM complete the action using this data. However, there is no explicitly marked answer in the comments.

Hi all, I created an agent that makes an external API call and retrieves JSON data of contact persons. I run into problems in my workflow when this API output is passed back to the LLM.

Is there a specific way I can have the LLM handle the returned JSON without any complications?
1 comment
You can give the LLM back a formatted response; that should work.

# Interpolate the values pulled from the JSON into a plain-text summary,
# then embed that summary in the prompt (note the f-strings -- without them
# the placeholders are passed to the LLM literally).
user_details = f"""user_name: {name}"""
llm_action = llm.complete(f"here is the user details: {user_details}\ndo the following actions using this data")
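To make the suggestion concrete: the JSON returned by the API can be parsed and flattened into a plain-text block before it reaches the LLM. A minimal sketch, assuming a hypothetical response shape with `name` and `email` fields (the field names and payload here are illustrative, not from the original thread):

```python
import json

# Hypothetical JSON payload as the contacts API might return it.
api_response = '[{"name": "Alice", "email": "alice@example.com"}, {"name": "Bob", "email": "bob@example.com"}]'

contacts = json.loads(api_response)

# Flatten each contact into a line of plain text the LLM can read reliably.
user_details = "\n".join(
    f"user_name: {c['name']}, email: {c['email']}" for c in contacts
)

# Embed the formatted details in the prompt instead of passing raw JSON.
prompt = (
    "Here are the user details:\n"
    f"{user_details}\n"
    "Do the following actions using this data."
)
```

The resulting `prompt` string would then be passed to your model call (e.g. `llm.complete(prompt)`), so the model sees readable key-value lines rather than a raw JSON dump.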