Hello, I'm implementing LlamaIndex's ContextChatEngine and looking for a way to get structured output from my chat engine. Is there a way to use an output parser or a Pydantic program (like LLMTextCompletionProgram) with ContextChatEngine? Any suggestions, please.
Hmm, I don't think there's any structured output support for chat engines yet. You'd have to prompt the chat engine to output JSON and parse the reply yourself, or write your own custom chat loop.
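To make the workaround concrete, here's a minimal sketch of that custom loop: inject JSON instructions into the prompt, call the model, then parse the reply into a typed object. The `fake_llm` function is a stand-in for a real `chat_engine.chat(...)` call (which would need an API key), and the `Answer` schema is purely illustrative, not anything from the LlamaIndex API:

```python
import json
from dataclasses import dataclass


@dataclass
class Answer:
    # Hypothetical output schema; replace with your own fields.
    summary: str
    sources: list


JSON_INSTRUCTIONS = (
    "Answer using only the context below. Respond with JSON of the form "
    '{"summary": "...", "sources": ["..."]} and nothing else.\n\n'
)


def fake_llm(prompt: str) -> str:
    # Stand-in for a real chat engine / LLM call; returns a canned JSON reply.
    return '{"summary": "LlamaIndex is a data framework.", "sources": ["doc1"]}'


def structured_chat(question: str, context: str, llm=fake_llm) -> Answer:
    # Build the prompt: JSON instructions + retrieved context + user question.
    prompt = JSON_INSTRUCTIONS + f"Context:\n{context}\n\nQuestion: {question}"
    raw = llm(prompt)
    data = json.loads(raw)  # raises ValueError if the model returned invalid JSON
    return Answer(**data)


result = structured_chat("What is LlamaIndex?", "LlamaIndex is a data framework.")
print(result.summary)
```

In a real loop you'd swap `fake_llm` for the chat engine call and likely add a retry when `json.loads` fails, since the model can ignore the formatting instructions.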