I see QueryPipelines are being phased out (in favor of Workflows, I assume). Workflows are nice but I also know the DSPy folks have created a LlamaIndexModule to wrap a QueryPipeline, for prompt optimization. Are there any plans between LlamaIndex and DSPy to optimize prompts within Workflows, given this appears to be the new direction?
it'll certainly be an interesting item on the backlog to try integrating Workflows with DSPy. if you or others wanted to look into contributing, that would be awesome — we'd love to promote it
Our RAG infrastructure uses a QueryPipeline. We're now thinking of upgrading to Workflows.
But we also have a side-project to optimize all our prompts and I'm looking at DSPy as an option for that. I was delighted to see some examples of links between the two frameworks. But with the advent of Workflows, I'm unsure what's the best path to a healthy RAG pipeline with optimized prompts.
Hey there! It looks like you're navigating some interesting changes with the transition from QueryPipelines to Workflows, and I totally get how that could create uncertainty, especially when optimizing prompts for your RAG pipeline. The biggest challenge here is figuring out how to effectively integrate DSPy for prompt optimization within the new Workflow framework without losing the benefits that LlamaIndex offers. I recommend exploring ways to create modular prompts that can be easily adapted for both DSPy and LlamaIndex, allowing you to maintain consistency while leveraging their strengths. If you need any assistance in setting this up or optimizing your prompts, I'd be more than happy to help! Best
@timothybeamish ahh I see. The equivalent of part 2 of that notebook in the Workflow setting would be something like a LlamaIndexWorkflowModule that takes in LlamaIndexWorkflows, and analyzes all the DSPy prompts within that workflow. Doesn't exist yet but would be certainly interesting to add.
Part 3 should still apply though. If you are able to use DSPy independently, you could optimize your prompts in DSPy and then plug them into any LlamaIndex prompt template, which could be part of a broader workflow.
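A rough sketch of that hand-off, to make the shape concrete. Everything here is illustrative stand-in code (plain Python, no DSPy/LlamaIndex imports); comments note where the real library calls would go:

```python
# Sketch of "optimize in DSPy, plug into a LlamaIndex template" (illustrative only).

# 1) Pretend this string is the instruction text recovered from a compiled
#    DSPy module after an optimizer run. In practice you'd pull it out of
#    the optimized program rather than hard-coding it.
optimized_instruction = (
    "Answer using only the provided context. "
    "If the context is insufficient, say so explicitly."
)

# 2) Fold it into a prompt template. With LlamaIndex this would be a
#    PromptTemplate(...) that a Workflow step formats and sends to the LLM.
RAG_TEMPLATE = (
    "{instruction}\n\n"
    "Context:\n{context_str}\n\n"
    "Question: {query_str}\n"
    "Answer:"
)

def build_prompt(context_str: str, query_str: str) -> str:
    """Render the final prompt a Workflow step would pass to the LLM."""
    return RAG_TEMPLATE.format(
        instruction=optimized_instruction,
        context_str=context_str,
        query_str=query_str,
    )

print(build_prompt(
    "Workflows are the newer orchestration abstraction in LlamaIndex.",
    "What replaces QueryPipelines?",
))
```

The point is just that the DSPy side and the Workflow side only need to agree on a string: optimization happens offline, and the workflow consumes the result as an ordinary prompt template.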