When using Query Pipelines, is there a way to format the outputs of the intermediate steps? At some steps, I would like to output a JSON string and save the intermediate output. I am currently using the prompt to ask for JSON output, but it is not consistent. Thanks in advance.
Yes, I am doing exactly that (correcting the JSON in a pipeline component), and it works well with gpt-3.5, but with gpt-4 I keep getting text before and after the JSON that my script can't handle. I can improve my script for cleaning up the JSON output, but I wanted to know whether there is a way to enable gpt-4's JSON output mode from the pipeline.
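For context, this is roughly what my cleanup step looks like, as a simplified sketch (the FnComponent wrapper is just how I happen to wire it in, and the import path may differ depending on your llama-index version):

```python
import json

# FnComponent lets a plain function sit between pipeline steps;
# the exact import path depends on the llama-index version installed.
from llama_index.core.query_pipeline import FnComponent


def extract_json(text: str) -> dict:
    """Pull the outermost {...} block out of the LLM output and parse it.

    Taking the span from the first '{' to the last '}' ignores any prose
    or code fences the model wraps around the JSON.
    """
    start, end = text.find("{"), text.rfind("}")
    if start == -1 or end == -1:
        raise ValueError(f"No JSON object found in output: {text!r}")
    return json.loads(text[start : end + 1])


# Wrap it so it can be added as a module in the QueryPipeline
clean_json = FnComponent(fn=extract_json)
```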
Have you tried just using the tool calling API instead (i.e. OpenAIPydanticProgram)?
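Something like this, as a rough sketch (the Song class and its fields are just an example, and the import path varies by llama-index version):

```python
from pydantic import BaseModel

from llama_index.llms.openai import OpenAI
from llama_index.program.openai import OpenAIPydanticProgram


# Example schema for illustration -- define whatever fields your step needs
class Song(BaseModel):
    title: str
    length_seconds: int


program = OpenAIPydanticProgram.from_defaults(
    output_cls=Song,
    prompt_template_str="Generate an example song about {topic}.",
    llm=OpenAI(model="gpt-4"),
    verbose=True,
)

# Returns a validated Song instance instead of raw text,
# so there is no JSON cleanup to do afterwards
song = program(topic="query pipelines")
print(song.title, song.length_seconds)
```

Since the output is a Pydantic object, you can still get a JSON string for your intermediate output with song.model_dump_json() (or song.json() on pydantic v1).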
I think you can activate JSON mode by setting it in the constructor too: llm = OpenAI(..., additional_kwargs={"response_format": {"type": "json_object"}})
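For example, a quick sketch (import path depends on your llama-index version, and note that OpenAI rejects the request unless the word "JSON" appears somewhere in the prompt when this mode is on):

```python
from llama_index.llms.openai import OpenAI

# JSON mode is only accepted by models that support response_format,
# e.g. gpt-3.5-turbo-1106+ or the gpt-4 turbo snapshots.
llm = OpenAI(
    model="gpt-3.5-turbo-1106",
    additional_kwargs={"response_format": {"type": "json_object"}},
)

# The prompt must mention JSON explicitly, otherwise the API returns an error.
response = llm.complete(
    "Return a JSON object with keys 'title' and 'length_seconds' describing a song."
)
print(response.text)  # should now always be a parseable JSON string
```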
Yes, I saw that tutorial, but I didn't realize that I could set JSON mode in the constructor. That looks promising and I am going to give it a try. Thank you!
I'm getting this error when using gpt-4: "Invalid parameter: 'response_format' of type 'json_object' is not supported with this model." I'll see what happens with gpt-3.5...
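If I'm reading the OpenAI model docs right, base gpt-4 (gpt-4-0613) doesn't support response_format; the 1106 and later turbo snapshots do, so pinning one of those should avoid the error:

```python
from llama_index.llms.openai import OpenAI

# Pin a snapshot that supports response_format;
# plain "gpt-4" (gpt-4-0613) rejects it with the error above.
llm = OpenAI(
    model="gpt-4-1106-preview",
    additional_kwargs={"response_format": {"type": "json_object"}},
)
```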