I have a bit more of a targeted use case maybe someone has tried that I'd like to discuss. Essentially I'm trying to provide a list of questions for an LLM to answer from a document (who was involved, what day did this occur), and return these answers in a more structured format (CSV, JSON, whatever) to begin building a structured table that could then be queried by SQL (probably using an NL-to-SQL model). I'm trying to understand if anyone has tried to generate more structured data out of the summarised responses from an LLM, and if there is a good llamaindex/langchain way to go about this, short of writing custom prompts to do this entirely?
ohhhh it's super tied to OpenAI. It's using their function calling API
I have seen Llama 2 used with a similar function calling API (i.e. llama-api, which we also support). But tbh open-source models are not great with structured outputs right now
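For anyone curious, here's a minimal sketch of the function-calling approach being discussed: you turn each question into a field in a JSON schema, pass it as a tool definition, and parse the model's tool-call arguments into a table row. The field names and the `record_answers` tool name are made up for illustration; the only real API surface assumed is OpenAI-style `tools` / `tool_calls`.

```python
import json

# Hypothetical questions -> schema fields (names are illustrative)
questions = {
    "people_involved": "Who was involved?",
    "event_date": "What day did this occur?",
}

# OpenAI-style tool definition: each question becomes a required
# string property, so the model must answer every question.
extraction_tool = {
    "type": "function",
    "function": {
        "name": "record_answers",
        "description": "Record structured answers extracted from the document.",
        "parameters": {
            "type": "object",
            "properties": {
                field: {"type": "string", "description": q}
                for field, q in questions.items()
            },
            "required": list(questions),
        },
    },
}

def parse_tool_call(arguments_json: str) -> dict:
    """Turn the model's function-call arguments into one table row."""
    row = json.loads(arguments_json)
    # Keep only the fields we asked for, in a stable column order,
    # so rows line up when appended to a CSV or SQL table.
    return {field: row.get(field) for field in questions}
```

With a real client you'd pass `tools=[extraction_tool]` to the chat completions call and feed `response.choices[0].message.tool_calls[0].function.arguments` into `parse_tool_call`; each document then yields one row you can append to your table.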