Setting a custom output parser for a StructuredLLM that wraps SageMakerLLM

Hi everyone,

I'm trying to set a custom output parser that inherits from PydanticOutputParser on a StructuredLLM that wraps a SageMakerLLM,
but every time the output_parser gets set to PydanticOutputParser instead.

I dug into the code and found that in the file llama-index-core/llama_index/core/program/utils.py
the output parser is hard-coded to PydanticOutputParser.

Is this a bug, or is there another way to set a custom output parser?


Snippet of the code:

    return LLMTextCompletionProgram.from_defaults(
        output_parser=PydanticOutputParser(output_cls=output_cls),  # type: ignore
        llm=llm,
        prompt=prompt,
        **kwargs,
    )

Possible fix:

    return LLMTextCompletionProgram.from_defaults(
        output_parser=PydanticOutputParser(output_cls=output_cls) if prompt.output_parser is None else prompt.output_parser,  # type: ignore
        llm=llm,
        prompt=prompt,
        **kwargs,
    )
1 comment
You could use LLMTextCompletionProgram yourself and customize the output parser there right?