
Hello, I am implementing LlamaIndex's ContextChatEngine.
I am looking for ways to get structured output from my chat engine. Is there a way to use an output parser / Pydantic program (like LLMTextCompletion) with ContextChatEngine? Any suggestions, please.
2 comments
Hmm, I don't think there's any structured output support yet for chat engines. You'd have to prompt the chat engine to output JSON, or make your own custom chat loop.
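A minimal sketch of the first suggestion (prompting the chat engine for JSON and validating it yourself with Pydantic). The schema, data directory, and question are made up for illustration, and import paths may differ across llama_index versions:

```python
# Sketch: prompt a ContextChatEngine for JSON and validate it with Pydantic.
# The Answer schema, "./data" path, and question are illustrative only.
import json
from typing import List

from pydantic import BaseModel, ValidationError
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex


class Answer(BaseModel):
    summary: str
    sources: List[str]


index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())
chat_engine = index.as_chat_engine(chat_mode="context")  # builds a ContextChatEngine

prompt = (
    "Answer the question, replying ONLY with JSON matching this schema: "
    '{"summary": "<string>", "sources": ["<string>", ...]}\n'
    "Question: What does the document say about pricing?"
)
raw = chat_engine.chat(prompt).response

try:
    answer = Answer(**json.loads(raw))
    print(answer.summary)
except (json.JSONDecodeError, ValidationError):
    # The model is not guaranteed to return clean JSON; fall back to the raw text.
    print(raw)
```

This keeps the chat engine's conversational memory and retrieval intact and only adds a parsing step on top, which is roughly what a custom chat loop would do as well.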
Ok, thanks Logan for your response.