A recent survey suggests that 49 out of 50 surveys presented on mainstream news outlets cause cancer.
It's well known that stress increases the risk of cancer. So a survey announcing a correlation between cancer and toast needs to provide enough cancer-avoiding informed choice to outweigh the inherent stress-inducing, cancer-causing agent that is the survey itself.
By applying the scientific method, before releasing a survey on the general public we take a second survey to assess the stress caused by the first report's findings. For example -
Survey suggests: a 0.1% increased risk of cancer from eating blackened toast over a lifetime.
Survey-survey suggests: a 0.2% increased risk of cancer from panicking about the above survey.
Clearly, the toast survey, if released to the public, will cause *more* cancer than it prevents. To further discount statistical nonsense, we require a factor of 4 over the survey-survey; that is, the cancer risk of toast needs to reach 0.8% before publication. We could even go 20 times and require a 4% extra risk from eating black toast.
It only took a 1% margin for people to be prescribed statins, and the research behind that was dubious.
This idea would reduce the number of rubbish surveys of the kind that take a particular dataset, correlate it in a particular way, and generate a positive 0.001% bias, producing the headline "Toast Kills!".
Maybe an OfSurvey could come up with some way of grading the survey result, translated into plain speak, to be declared on all news pieces. If the survey isn't graded, it's bollocks; otherwise the piece is only newsworthy if its grade exceeds something worthy of attention. People would soon learn to skip a news segment if the number in the top right was less than 1.0.
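Putting the pieces together, here's a minimal sketch of what that hypothetical OfSurvey grade might look like. Everything here is an assumption invented for illustration (the function name, the default safety factor of 4 from the toast example): the grade is the survey's claimed extra risk divided by the survey-survey's panic risk times the safety factor, with anything below 1.0 being skippable.

```python
def ofsurvey_grade(claimed_risk_pct, panic_risk_pct, safety_factor=4):
    """Hypothetical OfSurvey grade: claimed benefit of knowing the result
    relative to the harm of panicking about it, scaled by a safety factor.
    A grade below 1.0 means the survey causes more cancer than it prevents."""
    return claimed_risk_pct / (panic_risk_pct * safety_factor)

# The toast example: 0.1% claimed risk vs 0.2% panic risk at factor 4.
print(ofsurvey_grade(0.1, 0.2))        # → 0.125, well below 1.0: bin it
print(ofsurvey_grade(0.8, 0.2) >= 1.0) # → True, 0.8% would just make the cut
```

At the stricter factor of 20, the same rule demands the 4% figure from the toast example before the piece earns its number in the top right.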