The post asks whether there are ways to alter the underlying prompts and flows within query engines or retrievers, for example changing the prompt that generates sub-questions in the sub-question retriever, or the prompt used to synthesize answers.
The comments provide some information on how to customize prompts in LlamaIndex, a library for working with large language models. Community members mention that you can change the prompt for the sub-question query engine, and that there are two question generators available, OpenAIQuestionGenerator and LLMQuestionGenerator, both of which use a default prompt that can be modified. However, no explicit example of using the LLMQuestionGenerator is provided.
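For context, here is a minimal sketch of how a custom sub-question prompt might be wired into the sub-question query engine. It is an illustration rather than a confirmed answer from the thread: it assumes a recent LlamaIndex release where these classes live under llama_index.core (older versions imported them directly from llama_index), a hypothetical ./data folder of documents, and an LLM configured the usual way (e.g. OPENAI_API_KEY); exact signatures may differ across versions.

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.query_engine import SubQuestionQueryEngine
from llama_index.core.question_gen import LLMQuestionGenerator
from llama_index.core.question_gen.prompts import DEFAULT_SUB_QUESTION_PROMPT_TMPL
from llama_index.core.tools import QueryEngineTool, ToolMetadata

# Build a basic index and expose it as a tool the sub-question engine can call.
# "./data" and the tool name/description are placeholders for this sketch.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
tool = QueryEngineTool(
    query_engine=index.as_query_engine(),
    metadata=ToolMetadata(name="docs", description="Project documentation"),
)

# Start from the default sub-question prompt and prepend extra instructions,
# so the required template variables and output format stay intact.
custom_prompt = (
    "Keep each sub-question short and focused.\n"
    + DEFAULT_SUB_QUESTION_PROMPT_TMPL
)

# LLMQuestionGenerator accepts a custom prompt template string.
question_gen = LLMQuestionGenerator.from_defaults(
    prompt_template_str=custom_prompt,
)

# Pass the customized generator into the sub-question query engine.
query_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=[tool],
    question_gen=question_gen,
)

print(query_engine.query("Compare the setup steps across modules."))
```

The same idea applies to OpenAIQuestionGenerator, which relies on OpenAI function calling instead of plain prompting, so the prompt it accepts is shaped differently.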
There is no explicitly marked answer in the comments.
Are there currently any ways to alter the underlying prompts and flows within some of the query engines or retrievers? For example, altering the prompt for how the sub-questions are created in the sub-question retriever, or how the answers are synthesized? Thanks!
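The comments do not address the answer-synthesis part of the question, but one common way to customize it in LlamaIndex is to pass a custom QA template to a response synthesizer. The sketch below is an assumption-laden illustration (llama_index.core import paths, a hypothetical ./data folder, an OpenAI key configured), not something confirmed in the thread:

```python
from llama_index.core import (
    PromptTemplate,
    SimpleDirectoryReader,
    VectorStoreIndex,
    get_response_synthesizer,
)
from llama_index.core.query_engine import RetrieverQueryEngine

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Custom QA prompt; {context_str} and {query_str} are the variables
# the synthesizer fills in at query time.
qa_prompt = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the query using only the context above, as bullet points.\n"
    "Query: {query_str}\n"
    "Answer: "
)

# Build a synthesizer that uses the custom prompt, then plug it into a query engine.
synthesizer = get_response_synthesizer(text_qa_template=qa_prompt)
query_engine = RetrieverQueryEngine(
    retriever=index.as_retriever(similarity_top_k=3),
    response_synthesizer=synthesizer,
)

print(query_engine.query("Summarize the setup steps."))
```

A refine_template can usually be supplied the same way for the refine step, and index.as_query_engine(text_qa_template=qa_prompt) is a shorter route when you do not need an explicit retriever.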