Wonderful article by Ben Goldacre on how writing his equally wonderful Bad Science column has “increased his suspicion of the media by, ooh, a lot of per cents”. I'll resist the urge to quote the entire thing, although its general brilliance makes it hard to pick out a single quote.
Once journalists get their teeth into what they think is a scare story, trivial increases in risk are presented, often out of context, but always using one single way of expressing risk, the “relative risk increase”, that makes the danger appear disproportionately large (www.badscience.net/?p=8). This is before we mention the times, such as last week's Seroxat story, or the ibuprofen and heart attack story last month, when in their eagerness to find a scandal, half the papers got the figures wrong. This error, you can't help noticing, is always in the same direction.
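The "relative risk increase" trick Goldacre describes is easy to see with a toy calculation. The numbers below are invented purely for illustration, not taken from any study he mentions:

```python
# Hypothetical illustration of relative vs. absolute risk increase.
# All figures are made up for demonstration purposes.

def risk(events, total):
    """Proportion of participants who experienced the outcome."""
    return events / total

# Suppose a trial finds 2 heart attacks per 1,000 people in the control
# group and 3 per 1,000 in the treatment group.
control = risk(2, 1000)    # 0.002
treated = risk(3, 1000)    # 0.003

relative_increase = (treated - control) / control   # the headline number
absolute_increase = treated - control               # the honest number

print(f"Relative risk increase: {relative_increase:.0%}")   # 50%
print(f"Absolute risk increase: {absolute_increase:.1%}")   # 0.1%
```

Same data, two framings: "risk up 50%!" versus "one extra case per thousand". Guess which one makes the front page.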
As a science graduate (and I'm not sure that one really stops being a scientist, to be honest), I frequently find myself gurgling in despair at the news, declaiming something like “Well, it's clear they've never studied statistics!” whilst trying not to let my blood pressure rise too much. I really do think that some of the bollocks that gets passed off as 'science' in the media could be countered by simply teaching journalists about stats, particularly about what makes a study statistically significant (or not).
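To make the significance point concrete: here is a minimal sketch of a two-proportion z-test, using only the standard library, run on the same invented 2-vs-3-per-thousand figures. The test, the numbers, and the function name are all my own illustration, not anything from the column:

```python
import math

def two_proportion_z(events_a, n_a, events_b, n_b):
    """Two-proportion z-test: is the difference between two rates
    statistically significant? Returns (z statistic, two-sided p-value)."""
    p_a, p_b = events_a / n_a, events_b / n_b
    pooled = (events_a + events_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical trial: 3 events in 1,000 treated vs. 2 in 1,000 controls,
# i.e. the "50% relative risk increase" headline.
z, p = two_proportion_z(3, 1000, 2, 1000)
print(f"z = {z:.2f}, p = {p:.2f}")
# p comes out well above 0.05: nowhere near statistical significance,
# despite the scary-sounding relative increase.
```

A journalist who ran this check before filing would know the "50% increase" could easily be noise. That is the whole argument for teaching them stats.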
Oh, that and giving them a massive electric shock every time they sensationalise. That might work.