The community member created an agent that makes an external API call and retrieves JSON data about contact persons. They run into issues when this API output is passed back to the language model (LLM). A comment suggests providing a formatted response to the LLM, such as user_details="""user_name: {name}""", and then having the LLM complete the action using that data. However, there is no explicitly marked answer in the comments.
Hi all, I created an agent that makes an external API call and retrieves JSON data of contact persons. I run into problems in my workflow when this API output is passed back to the LLM.
Is there a specific way in which I can have the LLM handle the returned JSON without any complications?
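A minimal sketch of the approach suggested in the comments: fetch the JSON, flatten each contact into a labelled text block (e.g. `user_name: {name}`), and hand that plain string to the LLM instead of the raw JSON. The endpoint URL, the contact field names, and the OpenAI client used here are assumptions for illustration, not details from the original thread.

```python
import requests  # assumed HTTP client; any client works
from openai import OpenAI  # assumed LLM client; swap in your own

# Hypothetical endpoint -- replace with your actual contacts API.
CONTACTS_URL = "https://example.com/api/contacts"


def fetch_contacts() -> list[dict]:
    """Call the external API and return the parsed JSON payload."""
    response = requests.get(CONTACTS_URL, timeout=10)
    response.raise_for_status()
    return response.json()


def format_contacts(contacts: list[dict]) -> str:
    """Flatten the JSON into a labelled plain-text block the LLM can read reliably."""
    blocks = []
    for contact in contacts:
        # The field names (name, email, phone) are assumptions about the payload shape.
        blocks.append(
            "user_name: {name}\nemail: {email}\nphone: {phone}".format(
                name=contact.get("name", "unknown"),
                email=contact.get("email", "unknown"),
                phone=contact.get("phone", "unknown"),
            )
        )
    return "\n\n".join(blocks)


if __name__ == "__main__":
    contacts = fetch_contacts()
    user_details = format_contacts(contacts)

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You answer questions about the contacts below."},
            {"role": "user", "content": f"Contact details:\n{user_details}\n\nWho should I email about billing?"},
        ],
    )
    print(completion.choices[0].message.content)
```

The idea is simply that a labelled, human-readable string tends to be easier for the LLM to work with than a deeply nested JSON blob, and it keeps the prompt free of formatting noise like braces and quotes.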