Setting custom output parser for structuredllm wraps sagemakerllm
At a glance
The community member is trying to set a custom OutputParser that inherits from PydanticOutputParser on a StructuredLLM that wraps SageMakerLLM. However, they found that the output parser is hard-coded to PydanticOutputParser in llama-index-core/llama_index/core/program/utils.py. The community member is unsure whether this is a bug or whether there is another way to set a custom OutputParser.
In the comments, another community member suggests constructing an LLMTextCompletionProgram directly and supplying the custom output parser there.
I'm trying to set a custom OutputParser that inherits from PydanticOutputParser on a StructuredLLM that wraps SageMakerLLM, but every time the output_parser is set to PydanticOutputParser.
I dug into the code and found that in llama-index-core/llama_index/core/program/utils.py the output parser is hard-coded to PydanticOutputParser.
Is this a bug, or is there another way to set a custom OutputParser?