In a typical RAG pipeline, is there a dedicated node responsible for evaluating whether augmentation/external knowledge is needed in the first place? If yes, what's it called?
Typically, if I start a conversation with a chatbot with 'Hi', I don't need it to query my vector store for relevant chunks. Or rather, maybe the search happens anyway, but I don't need any retrieved context injected into the input prompt for it to answer me.
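To make concrete the kind of gating node I mean, here's a minimal sketch. The keyword heuristic is purely a placeholder assumption on my part; a real router would presumably use an LLM call or a trained classifier to make the decision:

```python
# Hypothetical "should we retrieve?" gate placed before the retrieval step.
# The small-talk keyword check is just an illustrative stand-in for a
# smarter classifier or LLM-based router.

SMALL_TALK = {"hi", "hello", "hey", "thanks", "thank you", "bye"}

def needs_retrieval(query: str) -> bool:
    """Return False for greetings/small talk, True otherwise."""
    return query.strip().lower().rstrip("!.?") not in SMALL_TALK

def answer(query: str) -> str:
    if needs_retrieval(query):
        # context = retriever.retrieve(query)  # only now hit the vector store
        return f"[RAG path] {query}"
    # Skip retrieval entirely; answer from the base LLM alone.
    return f"[direct path] {query}"

print(answer("Hi"))
print(answer("What does our refund policy say?"))
```

Is there an established name (and an off-the-shelf component) for this decision step?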
The closest thing I've found is a hybrid search with intersection logic, as described here:
https://docs.llamaindex.ai/en/stable/examples/query_engine/CustomRetrievers.html