I'm trying to set a custom output parser that inherits from PydanticOutputParser on a StructuredLLM that wraps a SageMakerLLM, but every time the output_parser ends up set to the base PydanticOutputParser instead.
I dug into the code and found that in `llama-index-core/llama_index/core/program/utils.py` the output parser is hardcoded to PydanticOutputParser.
Is this a bug, or is there another way to set a custom output parser?
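To make the issue concrete, here is a stdlib-only sketch of the pattern I'm describing (all class and function names here are hypothetical stand-ins, not the actual llama-index API): a factory that instantiates the base parser class directly will silently ignore any subclass the caller passes in, which matches the behavior I'm seeing.

```python
class PydanticOutputParserLike:
    """Stand-in for the base parser class."""
    def parse(self, text: str) -> str:
        return text.strip()


class MyCustomParser(PydanticOutputParserLike):
    """Stand-in for my subclass with custom post-processing."""
    def parse(self, text: str) -> str:
        return text.strip().upper()


def build_program_hardcoded(parser=None):
    # Mirrors the hardcoded behavior: the caller's parser is ignored
    # and the base class is always instantiated.
    return PydanticOutputParserLike()


def build_program_respecting(parser=None):
    # The behavior I expected: use the caller's parser when provided,
    # falling back to the base class only as a default.
    return parser if parser is not None else PydanticOutputParserLike()


custom = MyCustomParser()
print(type(build_program_hardcoded(custom)).__name__)   # base class wins
print(type(build_program_respecting(custom)).__name__)  # subclass preserved
```

In my case the hardcoded path means my subclass's `parse` override is never called, even though I pass the custom parser in explicitly.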