Updated last year

prompt

I have a RAG setup with sub-queries (the sub-queries aren't really the point; I don't mind dropping them). The problem is that even after the sub-queries say there is no answer in the given context, I still get an answer from the LLM's own knowledge. Why does this happen, and how do I stop it? Right now I am using GPT-3.5/4. How do I prevent the RAG pipeline from generating an answer about something that isn't provided in the context?
I just need it to not answer when the question is not covered by the documents I provided.
Attachment: image.png
5 comments
That did not work.
Also, why do the answers generated for the sub-questions say "not provided in context"? How does the answer generation work there?
Try setting the system_prompt in the service_context.

Python
from llama_index import ServiceContext

# The system prompt is prepended to every LLM call made through this context.
service_context = ServiceContext.from_defaults(
    system_prompt="<Your System Prompt Here>"
)


Try this once. The system prompt will be applied to every LLM operation made through that service context.
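For the system prompt itself, something along these lines can work. This is a minimal sketch; the wording and the helper function are my own illustration, not an official LlamaIndex template. The idea is to instruct the model to answer only from the retrieved context and to give it an explicit refusal string:

```python
# Sketch of a restrictive system prompt; the exact wording is an assumption,
# not from LlamaIndex.
STRICT_SYSTEM_PROMPT = (
    "You are a question-answering assistant. Answer ONLY using the context "
    "provided below. If the context does not contain the answer, reply "
    "exactly: \"I don't know based on the provided documents.\" "
    "Do not use any outside knowledge."
)

def build_prompt(context: str, question: str) -> str:
    """Illustrative helper: combine the system prompt, the retrieved
    context, and the user question into the final text sent to the LLM."""
    return (
        f"{STRICT_SYSTEM_PROMPT}\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
```

Note that even with a strict prompt, GPT-3.5/4 can occasionally fall back on its parametric knowledge; lowering the temperature and checking the response for the refusal string afterwards can help catch the remaining cases.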