Refine

Hi, I'm trying to understand prompts, and the refine_template in particular. Reading the docs page here https://docs.llamaindex.ai/en/stable/examples/prompts/prompts_rag.html I can see there is a "new context" field in the refine_template, but there is no mention of where this new context comes from. What is the default, and how do we change it?
There are two steps in a query engine:

retrieval and response synthesis

The default response mode is compact, which means it takes all the retrieved nodes and tries to compact them into larger prompts (to reduce LLM calls)
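
For concreteness, here is a minimal sketch of those two steps (assuming a recent llama_index.core install; the "./data" path, the question string, and similarity_top_k=5 are just placeholders):

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Step 1 (retrieval) and step 2 (response synthesis) both happen inside query().
# response_mode="compact" is already the default; it's shown here for clarity.
query_engine = index.as_query_engine(
    response_mode="compact",
    similarity_top_k=5,
)
response = query_engine.query("What does the refine template do?")
print(response)
```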

If the (compacted) retrieved text does not fit into one LLM call, it will use the refine template.

This means we have an existing answer, but more retrieved context that the LLM hasn't read yet. So we show the LLM the existing answer plus the next piece of context (that's the "new context" in the refine_template), asking it to either repeat or update its answer.
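
As for changing it: the linked prompts_rag page shows get_prompts() / update_prompts() on the query engine. Roughly, continuing from the query_engine above (the prompt wording is only an illustration):

```python
from llama_index.core import PromptTemplate

# Inspect the prompts the query engine currently uses; the refine prompt
# lives under the "response_synthesizer:refine_template" key.
print(list(query_engine.get_prompts().keys()))

# A custom refine prompt must keep the variables the refine step fills in:
# {query_str}, {existing_answer}, and {context_msg} (the "new context").
custom_refine = PromptTemplate(
    "The original question is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "Here is new context that was not available before:\n{context_msg}\n"
    "Refine the existing answer with the new context if it helps; "
    "otherwise return the existing answer unchanged."
)
query_engine.update_prompts(
    {"response_synthesizer:refine_template": custom_refine}
)
```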
Ok, clear now. Thanks.