I'm also working on an integration to use it with the router query engine (it should help make the index selectors more stable by using pydantic instead of parsing free-form LLM output)
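Roughly what I have in mind (just a sketch, assuming the newer `llama_index.core` import paths; the data path and tool descriptions are placeholders):

```python
from llama_index.core import SimpleDirectoryReader, SummaryIndex, VectorStoreIndex
from llama_index.core.query_engine import RouterQueryEngine
from llama_index.core.selectors import PydanticSingleSelector
from llama_index.core.tools import QueryEngineTool

docs = SimpleDirectoryReader("./data").load_data()

# Two candidate engines; the selector picks one per query
vector_tool = QueryEngineTool.from_defaults(
    query_engine=VectorStoreIndex.from_documents(docs).as_query_engine(),
    description="Useful for answering specific questions about the docs",
)
summary_tool = QueryEngineTool.from_defaults(
    query_engine=SummaryIndex.from_documents(docs).as_query_engine(),
    description="Useful for summarizing the docs",
)

# The pydantic selector uses function calling + schema validation rather
# than parsing raw LLM text, which is what should make routing more stable
router = RouterQueryEngine(
    selector=PydanticSingleSelector.from_defaults(),
    query_engine_tools=[vector_tool, summary_tool],
)
print(router.query("Give me an overview of the documents"))
```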
Is the reason for putting the functions in a vector store so that the LLM can decide which function to use by comparing embeddings to find the most relevant one?
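Something like this is what I mean, a rough sketch of that tool-retrieval pattern using `ObjectIndex` (assuming the `llama_index.core` import paths; the two functions are just made-up examples):

```python
from llama_index.core import VectorStoreIndex
from llama_index.core.objects import ObjectIndex
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

def word_count(text: str) -> int:
    """Count the words in a piece of text."""
    return len(text.split())

tools = [
    FunctionTool.from_defaults(fn=multiply),
    FunctionTool.from_defaults(fn=word_count),
]

# Each tool's name + description gets embedded; the query is embedded the
# same way and the nearest tools are retrieved for the LLM to call
obj_index = ObjectIndex.from_objects(tools, index_cls=VectorStoreIndex)
tool_retriever = obj_index.as_retriever(similarity_top_k=1)
print(tool_retriever.retrieve("how many words are in this sentence?"))
```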
Would it make sense to create a function that takes no params but returns a JSON object of themes it found in a doc?
Basically I want the input to be the context from my uploaded docs, and the output to be a JSON object with e.g. a "summary" key whose value is the text the LLM generated
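Something like this is what I'm picturing (a rough sketch, assuming a recent llama-index with `structured_predict` and pydantic v2; the model name, prompt, and data path are placeholders):

```python
from pydantic import BaseModel

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.prompts import PromptTemplate
from llama_index.llms.openai import OpenAI

class DocThemes(BaseModel):
    """Structured output: themes found in the docs plus a summary."""
    themes: list[str]
    summary: str

# Pull relevant context out of the uploaded docs first
index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())
nodes = index.as_retriever(similarity_top_k=3).retrieve("main themes of the documents")
context = "\n\n".join(n.get_content() for n in nodes)

# Ask the LLM to fill in the schema instead of returning free-form text
llm = OpenAI(model="gpt-4o-mini")
result = llm.structured_predict(
    DocThemes,
    PromptTemplate(
        "Given this context:\n{context}\n"
        "List the main themes and write a short summary."
    ),
    context=context,
)
print(result.model_dump())  # {"themes": [...], "summary": "..."}
```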