
Updated 6 months ago

Prompts

At a glance

The post asks if there are ways to alter the underlying prompts and flows within query engines or retrievers, such as changing the prompt for how sub-questions are created in the sub-question retriever and how answers are synthesized.

The comments provide some information on how to customize prompts in LlamaIndex, a library for working with large language models. Community members mention that you can change the prompt for the sub-question query engine, and that there are two ways to generate questions: OpenAIQuestionGenerator and LLMQuestionGenerator, both of which use a default prompt that can be modified. However, there are no explicit examples provided on how to use the LLMQuestionGenerator.

There is no explicitly marked answer in the comments.

Useful resources
Are there any ways currently to alter the underlying prompts and flows within some of the query engines or retrievers? For example, altering the prompt for how the sub-questions are created in the sub-question retriever, or how the answers are synthesized? Thanks!
5 comments
Reference on how you can customize prompts in LlamaIndex: https://docs.llamaindex.ai/en/stable/module_guides/models/prompts.html

Also, for the sub-question query engine, you can change the prompt as needed from here: https://github.com/run-llama/llama_index/blob/main/llama_index/query_engine/sub_question_query_engine.py


It has two ways to generate the questions:
  • OpenAIQuestionGenerator
  • LLMQuestionGenerator
Both of them use a default prompt which you can modify!
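The default sub-question prompt is a plain format string, so overriding it amounts to supplying your own template. A minimal sketch, assuming the same `{tools_str}` and `{query_str}` placeholders the default template uses (the exact import path and keyword name follow the linked source and may differ across llama_index versions):

```python
# A custom sub-question prompt as a plain format string. The two
# placeholders below are assumed to match the default template:
# {tools_str} (the available tools) and {query_str} (the user's question).
custom_prompt = (
    "You are given a user question and a list of tools.\n"
    "Break the question into sub-questions, each paired with the tool "
    "that should answer it.\n"
    "Tools:\n{tools_str}\n"
    "Question: {query_str}\n"
)

# Passing it to the generator (hedged -- import path and keyword name
# taken from the linked source, and may vary by version):
#
#   from llama_index.question_gen import LLMQuestionGenerator
#   question_gen = LLMQuestionGenerator.from_defaults(
#       prompt_template_str=custom_prompt,
#   )

# The template itself is an ordinary Python format string:
filled = custom_prompt.format(
    tools_str="- docs_tool: searches the documentation",
    query_str="How do I customize prompts?",
)
print(filled)
```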
@WhiteFang_Jr How do I generate questions with LLMQuestionGenerator?
Are there any examples of using it?
Not sure if there is a direct example of using it, but you can check its generate method and follow the requirements there:

https://github.com/run-llama/llama_index/blob/cdfebe6f438461549a5b3647c697bf4bbaa51d47/llama_index/question_gen/llm_generators.py#L61
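Per the linked method, generation takes the tool metadata and the query and returns a list of sub-questions. The stub below only illustrates that call shape with stand-in dataclasses (the real classes live in llama_index, e.g. `ToolMetadata` and `SubQuestion`; a real generator would prompt the LLM rather than loop over tools):

```python
from dataclasses import dataclass
from typing import List, Sequence

# Stand-in mirrors of the inputs/outputs the linked generate() method
# works with -- illustrative only, not the real llama_index classes.
@dataclass
class ToolMetadata:
    name: str
    description: str

@dataclass
class SubQuestion:
    sub_question: str
    tool_name: str

def generate_stub(tools: Sequence[ToolMetadata], query_str: str) -> List[SubQuestion]:
    # A real question generator asks the LLM to decompose the query;
    # this stub just routes one sub-question to each tool to show the shape.
    return [
        SubQuestion(sub_question=f"{query_str} (via {t.name})", tool_name=t.name)
        for t in tools
    ]

tools = [ToolMetadata(name="docs", description="searches the documentation")]
subqs = generate_stub(tools, "How do I customize prompts?")
print(subqs[0].tool_name)
```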