Hey all, is the prompt framework in LlamaIndex compatible with LangChain's prompt tooling? I'm just learning both and was wondering if it's possible to use them at the same time. LangChain has a prompt library that looks neat, and I'm curious what's involved in combining the two frameworks (if that's possible at all) and what considerations I should have going into this phase. I'm currently building a query engine tool that doesn't chat with end users; I just need to build prompts to guide the LLM's answers and output. Thanks kindly for any guidance getting this going!
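To make the question concrete, here's roughly the workflow I have in mind. This is just a sketch with stand-in classes (NOT the real LangChain or LlamaIndex APIs, whose exact signatures I don't know yet); the point is that both frameworks' prompts ultimately boil down to f-string-style templates, so I'm hoping a template written on one side can carry over to the other:

```python
# Hypothetical stand-ins to illustrate the interop I'm after --
# these are NOT real LangChain / LlamaIndex classes.

class LangChainStylePrompt:
    """Stand-in for a LangChain-style prompt template:
    a format string plus its expected input variables."""
    def __init__(self, template: str, input_variables: list[str]):
        self.template = template
        self.input_variables = input_variables

    def format(self, **kwargs) -> str:
        # Render the template with the supplied variables.
        return self.template.format(**kwargs)


# A QA prompt authored "LangChain style"...
lc_prompt = LangChainStylePrompt(
    template=(
        "Answer using ONLY the context below.\n"
        "Context: {context_str}\n"
        "Question: {query_str}\n"
        "Answer:"
    ),
    input_variables=["context_str", "query_str"],
)


def to_llama_index_template(prompt: LangChainStylePrompt) -> str:
    """Hypothetical conversion step: since both sides use f-string
    placeholders, the raw template string should carry over."""
    return prompt.template


# ...which I'd then hand to whatever prompt type a LlamaIndex
# query engine expects when customizing its QA template.
li_template_str = to_llama_index_template(lc_prompt)
rendered = li_template_str.format(
    context_str="(retrieved chunks go here)",
    query_str="(user question goes here)",
)
print(rendered)
```

If the real conversion is more involved than passing the template string across (e.g. placeholder names differ between the frameworks), that's exactly the kind of gotcha I'd love pointers on.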
@Logan M thanks for your suggestion. Where can I find a more in-depth tutorial on using prompts with query_engines and retrievers? I'd like to take advantage of some of LangChain's benefits, but I'm not sure where to look for guidance. Have you come across any papers or tutorials that do a good job laying out the groundwork for using both (the LlamaIndex and LangChain prompt frameworks) together? I have a bunch of vector indices built with LlamaIndex now, but I get the sense that LangChain does more on the prompt side of things.