Bicycle Helmets and the law: a perfect teaching case for epidemiology.

December 13th, 2013 by Ben Goldacre in epidemiology, risk, statistics | 15 Comments »

Hi all, I haven’t posted much on badscience.net due to exciting home events, fun dayjob activity, a ton of behind-the-scenes work on trials transparency with alltrials.net, activity on policy RCTs, exciting websites, and a zillion talks.  I’m going to post this year’s backlog over the next week or two (and maybe rejig the site if I get a chance). So first up…

Here’s an editorial I wrote in the British Medical Journal with David Spiegelhalter, about the complex contradictory mess of evidence on the impact of bicycle helmets. Like most places where there’s controversy and disagreement, this is a great opportunity to walk through the benefits and shortcomings of different epidemiological techniques, from case control studies to modelling. Epidemiology is my dayjob; Bad Science and Bad Pharma are both, effectively, epidemiology textbooks with bad guys; and since the techniques of epidemiology are at the core of most media stories and squabbles on health, it’s very weird that you don’t hear the word more often. More on that in another journal article, which I’ll post later on! Read the rest of this entry »

Is this the worst government statistic ever created?

April 23rd, 2012 by Ben Goldacre in economics, evidence based policy, government reports, politics, pr guff, statistics | 24 Comments »

I forgot to post this column up last year. It’s a fun one: the Department for Communities and Local Government have produced a truly farcical piece of evidence, and promoted it very hard, claiming it as good stats. I noticed the column was missing today because Private Eye have covered the same report in their current issue, finding emails that went missing from FOI applications, and other nonsense. That part is all neatly summarised online in the Local Government Chronicle here.

Is this the worst government statistic ever created?

Ben Goldacre, The Guardian, 24 June 2011.

Every now and then, the government will push a report that’s so asinine, and so thin, you have to check it’s not a spoof. The Daily Mail was clear in its coverage: “Council incompetence ‘costs every household £452 a year’”; “Up to £10bn a year is wasted by clueless councils.” And the Express agreed. Where will this money come from? “Up to £10bn a year could be saved … if councils better analysed spending from their £50bn procurement budgets.” Read the rest of this entry »

These Guardian / Independent stories are dodgy. Traps in data journalism.

December 30th, 2011 by Ben Goldacre in guardian, numerical context, statistics | 13 Comments »

Here’s an interesting problem with data analysis in general, and so, by extension, data journalism: you have to be careful about assuming that the numbers you’ve got access to… really do reflect the underlying phenomena you’re trying to investigate.

Today’s Guardian has a story, “Antidepressant use in England soars“. The Independent overstates it even further. Both identify that the number of individual prescriptions written for antidepressant drugs has risen, and then assume this means that more people are depressed. But while that’s a tempting assumption, it’s not a safe one. Read the rest of this entry »
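One way to see why the assumption is unsafe: the same number of patients can generate very different prescription counts, depending on how often scripts are written. A toy calculation (all numbers invented for illustration, not taken from the article):

```python
def prescriptions_issued(patients, months_treated, months_per_prescription):
    """Total individual prescriptions written, given how long each
    prescription lasts. Note the patient count never changes."""
    scripts_per_patient = months_treated / months_per_prescription
    return patients * scripts_per_patient

# The same 1,000 patients, each treated for 12 months, produce very
# different headline totals depending only on prescribing habits:
quarterly = prescriptions_issued(1000, 12, 3)  # one script every 3 months
monthly = prescriptions_issued(1000, 12, 1)    # one script every month
```

If GPs shift from quarterly to monthly prescribing, the prescription count triples while the number of people being treated stays exactly flat.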

What if academics were as dumb as quacks with statistics?

October 3rd, 2011 by Ben Goldacre in methods, neurostuff, statistics | 39 Comments »

Ben Goldacre, The Guardian, Saturday 10th September 2011

We all like to laugh at quacks when they misuse basic statistics. But what if academics, en masse, deploy errors that are equally foolish? This week Sander Nieuwenhuis and colleagues publish a mighty torpedo in the journal Nature Neuroscience.

They’ve identified one direct, stark statistical error that is so widespread it appears in about half of all the published papers surveyed from the academic neuroscience research literature. Read the rest of this entry »
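The error in question — “the effect was significant in one group but not in the other, therefore the groups differ” — can be shown with a toy calculation; the correct approach is to test the difference between the two effects directly. All numbers below are illustrative, not from the paper:

```python
import math

Z_CRIT = 1.96  # two-sided 5% threshold for a z statistic

def z_score(effect, se):
    """Effect estimate divided by its standard error."""
    return effect / se

# Two hypothetical effect estimates with their standard errors:
effect_a, se_a = 2.0, 1.0   # z = 2.00 -> "significant"
effect_b, se_b = 1.0, 1.0   # z = 1.00 -> "not significant"

# The flawed inference stops there. The correct test compares the
# two effects directly:
diff = effect_a - effect_b
se_diff = math.sqrt(se_a**2 + se_b**2)
z_diff = z_score(diff, se_diff)  # ~0.71 -> no evidence the groups differ
```

So one effect clears the significance bar and the other doesn’t, yet the difference between them is nowhere near significant: exactly the situation the flawed analyses get wrong.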

Benford’s Law: using stats to bust an entire nation for naughtiness.

September 23rd, 2011 by Ben Goldacre in crime, economics, statistics, structured data | 8 Comments »

Ben Goldacre, The Guardian, Saturday 17 September 2011

This week we might bust an entire nation for handing over dodgy economic statistics. But first: why would they bother? Well, it turns out that whole countries have an interest in distorting their accounts, just like companies and individuals. If you’re a Euro member like Greece, for example, you have to comply with various economic criteria, and there’s the risk of sanctions if you miss them. Read the rest of this entry »

Brain imaging studies report more positive findings than their numbers can support. This is fishy.

August 26th, 2011 by Ben Goldacre in academic publishing, publication bias, regulating research, statistics | 22 Comments »

Ben Goldacre, The Guardian, Saturday 13 August 2011

While the authorities are distracted by mass disorder, we can do some statistics. You’ll have seen plenty of news stories telling you that one part of the brain is bigger, or smaller, in people with a particular mental health problem, or even a specific job. These are generally based on real, published scientific research. But how reliable are the studies?

One way of critiquing a piece of research is to read the academic paper itself, in detail, looking for flaws. But that might not be enough, if sources of bias exist outside the paper, in the wider system of science.

Read the rest of this entry »
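That “wider system” check can be sketched numerically: if you know roughly how well powered the studies in a field are, you can work out how many positive findings they could plausibly produce, and compare that with how many they actually report. This is a sketch of the excess-significance idea, with invented numbers; the actual analysis in the study the column discusses is more involved:

```python
def expected_positives(n_studies, power, alpha=0.05, prop_true=1.0):
    """Expected number of 'positive' studies, assuming a proportion
    prop_true of the tested effects are real, each study detects a real
    effect with probability `power`, and null effects yield false
    positives at rate alpha."""
    true_effects = n_studies * prop_true
    null_effects = n_studies - true_effects
    return true_effects * power + null_effects * alpha

# Illustrative: 50 studies at a typical power of 0.3 should yield about
# 15 positive findings even if every effect under study is real.
ceiling = expected_positives(50, 0.3)
```

If a literature of 50 such studies reports, say, 45 positive results, it is reporting far more positives than its statistical power can support — which is the fishy pattern the headline describes.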

Sampling error, the unspoken issue behind small number changes in the news

August 22nd, 2011 by Ben Goldacre in bbc, media, statistics, uncertainty | 19 Comments »

Ben Goldacre, The Guardian, Saturday 20 August 2011

What do all these numbers mean? “‘Worrying’ jobless rise needs urgent action – Labour” was the BBC headline. They explained the problem in their own words: “The number of people out of work rose by 38,000 to 2.49 million in the three months to June, official figures show.”

Now, there are dozens of different ways to quantify the jobs market, and I’m not going to summarise them all here. The claimant count and the labour force survey are commonly used, and number of hours worked is informative too: you can fight among yourselves over which is best, and get distracted by party politics to your heart’s content. But in claiming that this figure for the number of people out of work has risen, the BBC is simply wrong.

Read the rest of this entry »
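The underlying point is one line of arithmetic: an estimate from a sample survey comes with a margin of error, and a change smaller than that margin is statistically indistinguishable from no change at all. The ±87,000 interval below is an assumption for illustration — the right order of magnitude for a large labour survey, but not a figure taken from this excerpt:

```python
def change_is_detectable(change, ci_halfwidth):
    """A measured change is distinguishable from zero, at the level of
    the confidence interval, only if it exceeds the interval's half-width."""
    return abs(change) > ci_halfwidth

rise = 38_000        # the headline quarterly rise in unemployment
half_width = 87_000  # assumed 95% CI half-width on the survey estimate

change_is_detectable(rise, half_width)  # the rise sits well inside the noise
```

On these numbers, the honest headline is not “unemployment rose by 38,000” but “unemployment may have risen, fallen, or stayed the same — the survey can’t tell.”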

Anarchy for the UK. Ish.

April 3rd, 2011 by Ben Goldacre in presenting numbers, statistics | 29 Comments »

Ben Goldacre, The Guardian, Saturday 2 April 2011

Here are two fun ways that numbers can be distorted for political purposes. Stop me if I’m boring you, but each of them feels oddly poetic, in its ability to smear or stifle.

The first is simple: you can conflate two different things into one number, either to inflate a problem, or confuse it. Last weekend, a few hundred thousand people marched in London against the cuts. On the same day, there was some violent disturbance, windows smashed, policemen injured, and drunkenness. Read the rest of this entry »

How to read a paper

January 29th, 2011 by Ben Goldacre in bad science, mail, statistics, sun | 49 Comments »

Ben Goldacre, The Guardian, Saturday 29 January 2011

If science has any authority, it derives from transparency: you can check the claims against the working. Sometimes you hit a brick wall. Sometimes you might consider a shortcut. Let’s look at 3 types of checking. Read the rest of this entry »

Putting a number in its context

January 8th, 2011 by Ben Goldacre in bad science, numerical context, statistics | 28 Comments »

Ben Goldacre, The Guardian, Saturday 8th January 2011

“600 pregnancies despite contraceptive implant” said the BBC. “500 fall pregnant after having contraceptive implant” said the Express. “Contraceptive implant alert” said the Daily Mail: “Hundreds of women fall pregnant after birth control fails”. Read the rest of this entry »
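The missing context in all three headlines is the denominator: 600 pregnancies means nothing until you know how many women were using the implant, and for how long. A toy calculation (every figure here is invented for illustration, not taken from the article):

```python
def failure_rate(failures, users, years_each):
    """Failures per 100 woman-years of use -- the denominator
    the headlines left out."""
    woman_years = users * years_each
    return 100 * failures / woman_years

# Hypothetical: 600 pregnancies among 1,000,000 users, averaging
# 3 years of use each.
rate = failure_rate(600, 1_000_000, 3)  # failures per 100 woman-years
```

On those made-up numbers the failure rate is 0.02 per 100 woman-years — and only a rate like that, set against the failure rates of other contraceptives, tells you whether 600 is alarming or reassuring.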