Optimizing structured output generation with the Vercel AI SDK and Anthropic's Claude model
At a glance
The community member is using the Vercel AI SDK and Anthropic's Claude 3.5 Sonnet model to generate structured output. The responses are a reasonably sized JSON object, but there is a delay of 3-5 seconds before the later fields in the object (including a nested JSON object with a complex schema) are populated. The community member assumes this is due to the function calls needed to get structured data from Claude, and is wondering if there is a way to decrease the delay, as it is a rough user experience. Another community member has suggested they may have a solution to the problem.
We’re using the Vercel AI SDK’s streamObject function with Anthropic’s Claude 3.5 Sonnet model to generate structured output. Our responses are reasonably sized JSON objects (~5 KB), but the first two fields (strings that are a paragraph or two long) populate quickly, and then the model seems to pause for 3-5 seconds before populating the next fields (one of which is a nested JSON object with a fairly complex schema). We assume this is due to the function calls needed to get structured data back from Claude, since it doesn’t have native support for structured output, but we’re wondering if there’s a way to decrease that delay, as it makes for a really rough user experience.
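For context, a minimal, self-contained sketch of the streaming pattern being described: streamObject exposes a partialObjectStream that yields progressively larger partial objects, so a UI can render early fields while later (nested) fields are still generating. The type names, field names, and values below are illustrative stand-ins, not taken from the thread, and the generator here only simulates the stream rather than calling the SDK.

```typescript
// Illustrative shape of a streamed response: two early string fields,
// then a nested object that arrives last (where the 3-5 s gap is felt).
type PartialReport = {
  summary?: string;
  analysis?: string;
  details?: { score?: number; tags?: string[] };
};

// Stand-in for `streamObject(...).partialObjectStream`: yields snapshots
// of the object in the order fields are emitted by the model.
async function* fakePartialObjectStream(): AsyncGenerator<PartialReport> {
  yield { summary: "First paragraph arrives quickly." };
  yield {
    summary: "First paragraph arrives quickly.",
    analysis: "Second field streams soon after.",
  };
  // In the situation described above, there is a multi-second pause
  // here before the nested object begins to fill in.
  yield {
    summary: "First paragraph arrives quickly.",
    analysis: "Second field streams soon after.",
    details: { score: 0.9, tags: ["nested", "schema"] },
  };
}

async function consume(): Promise<PartialReport> {
  let latest: PartialReport = {};
  for await (const partial of fakePartialObjectStream()) {
    latest = partial; // a real UI would re-render on each snapshot
  }
  return latest;
}

consume().then((finalObject) => {
  console.log(JSON.stringify(finalObject.details?.tags));
});
```

Because each snapshot is a superset of the previous one, the UI can show the early string fields immediately instead of waiting for the full object, which softens (though does not eliminate) the perceived delay before the nested field arrives.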