YoniPick
Joined October 29, 2024
Hi everyone,

I'm trying to pass a custom output parser that inherits from PydanticOutputParser to a StructuredLLM that wraps a SageMakerLLM, but every time output_parser ends up set to a plain PydanticOutputParser.

I dug into the code and found that in llama-index-core/llama_index/core/program/utils.py
the output parser is hardcoded to PydanticOutputParser.

Is this a bug, or is there another way to set a custom output parser?


Snippet of the code:

return LLMTextCompletionProgram.from_defaults(
    output_parser=PydanticOutputParser(output_cls=output_cls),  # type: ignore
    llm=llm,
    prompt=prompt,
    **kwargs,
)

Possible fix:

return LLMTextCompletionProgram.from_defaults(
    output_parser=(
        PydanticOutputParser(output_cls=output_cls)
        if prompt.output_parser is None
        else prompt.output_parser
    ),  # type: ignore
    llm=llm,
    prompt=prompt,
    **kwargs,
)