Hi, is there any way to prevent hallucinations? I have uploaded data about some skincare products, and when I ask whether they are dermatologically tested, it says they are even when that isn't specified anywhere. Also, if I ask about promotions or offers, it points me to a Promotions or Offers page even though it hasn't been fed those details. I have instructed in the prompt to answer only based on the context information, but it still hallucinates.
A thorough prompt with explicit instructions, using ALL CAPS to emphasize the most important parts, can help the LLM follow them better.
But this will only help with an OpenAI LLM or a good open-source LLM with strong instruction-following capabilities.
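To make this concrete, here is a minimal sketch of what such a strict, context-only prompt could look like, assuming the OpenAI Python client (v1+). The model name, the `answer` helper, the refusal wording, and the sample context are illustrative assumptions, not something from your setup:

```python
# Minimal sketch: a context-grounded prompt with explicit refusal rules.
# Assumes the OpenAI Python client >= 1.0 and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = """You are a product assistant for a skincare catalog.
RULES:
1. Answer ONLY from the CONTEXT below. Do NOT use outside knowledge.
2. If the CONTEXT does not contain the answer (e.g. dermatological testing,
   promotions, offers), reply EXACTLY: "I don't have that information."
3. NEVER invent pages, links, promotions, or product claims."""

def answer(question: str, context: str) -> str:
    """Ask the model a question grounded strictly in the retrieved context."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any capable instruction-following model
        temperature=0,        # lower temperature discourages free-form invention
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user",
             "content": f"CONTEXT:\n{context}\n\nQUESTION: {question}"},
        ],
    )
    return response.choices[0].message.content

# Example: the context says nothing about testing, so the model should refuse.
print(answer("Is this dermatologically tested?",
             "Product: AloeCalm Moisturizer. Ingredients: aloe vera, glycerin."))
```

The key design choices are an explicit refusal instruction with exact fallback wording (so the model has a sanctioned alternative to guessing) and temperature 0; even so, weaker models may still ignore the rules, which is the caveat above.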