Ben Goldacre, The Guardian, Saturday 5 June 2010
“Fish oil helps schoolchildren to concentrate” was the headline in the Observer. Regular readers will remember the omega-3 fish oil pill issue, as the entire British news media has been claiming for several years now that there are trials showing it improves school performance and behaviour in mainstream children, despite the fact that no such trial has ever been published. There is something very attractive about the idea that solutions to complex problems in education can be found in a pill.
So have things changed? The Observer’s health correspondent, Denis Campbell, is on the case, and it certainly sounds like they have. “Boys aged eight to 11 who were given doses once or twice a day of docosahexaenoic acid, an essential fatty acid known as DHA, showed big improvements in their performance during tasks involving attention.” Great. “The researchers gave 33 US schoolboys 400mg or 1,200mg doses of DHA or a placebo every day for eight weeks. Those who had received the high doses did much better in mental tasks involving mathematical challenges.” Brilliant news.
Is it true? After some effort, I have tracked down the academic paper. The first thing to note is that this study was not a trial of whether fish oil pills improve children's performance, it was a brain imaging study. They took 33 kids, divided them into three groups (of 10, 10 and 13 children) and then gave them either: no omega-3, a small dose, or a big dose. Then the children performed some attention tasks in a brain scanner, to see if bits of their brains lit up differently.
Why am I saying “omega-3”? Because it wasn’t a study of fish oil, as the Observer says, but of omega-3 fatty acids derived from algae. Small print.
If this had been a trial to detect whether omega-3 improves performance, it would be laughably small: around a dozen children in each group. While small studies aren't entirely useless, contrary to what amateurs often claim, you do have a very small number of observations to work from, so your study is much more prone to error from the simple play of chance. A study with 11 children in each arm could conceivably detect an effect, but only if the fish oil caused a gigantic and unambiguous improvement in all the children who got it, and none of the children on placebo improved at all.
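For the statistically inclined, here is a rough sketch of that point, not taken from the paper itself: a quick simulation of a simple two-group comparison with 11 children per arm, using hypothetical effect sizes, shows how little chance a study this size has of detecting anything short of an enormous effect.

```python
# Illustrative only: rough power of a two-group comparison with ~11 children per arm.
# Assumes a plain two-sample t-test; the effect sizes below are hypothetical,
# not figures from the study discussed above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_per_arm = 11          # roughly the size of each group in the imaging study
n_simulations = 10_000
alpha = 0.05

for cohens_d in (0.2, 0.5, 0.8, 1.25):   # small, medium, large, very large effects
    hits = 0
    for _ in range(n_simulations):
        placebo = rng.normal(0.0, 1.0, n_per_arm)
        treated = rng.normal(cohens_d, 1.0, n_per_arm)  # true difference of d standard deviations
        _, p = stats.ttest_ind(treated, placebo)
        hits += p < alpha
    print(f"Cohen's d = {cohens_d:.2f}: power ≈ {hits / n_simulations:.0%}")

# Roughly: a small effect (d = 0.2) is detected about 7% of the time, a medium
# effect (d = 0.5) about 20%, a large effect (d = 0.8) well under half the time,
# and only a huge effect of around d = 1.25 gets you to the conventional 80% power.
```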
This paper showed no difference in performance at all. Since it was a brain imaging study, not a trial, they only report the results of children’s actual performance on the attention task in passing, in a single paragraph, but they are clear: “there were no significant group differences in percentage correct, commission errors, discriminability, or reaction time”.
So this is all looking pretty wrong. Are we even talking about the same academic paper? I’ve a long-standing campaign to get mainstream media to link to original academic papers when they write about them, at least online, with some limited success on the BBC website. I asked Denis Campbell which academic paper he was referring to, but he declined to answer, and passed me on to Stephen Pritchard, the readers’ editor for the Observer, who answered a couple of days later to say he didn’t understand why he was being involved. Eventually Denis confirmed, but through Stephen Pritchard, that it was indeed this paper (http://qurl.com/denis) from the April edition of the American Journal of Clinical Nutrition.
If we are very generous, is it informative, in any sense, that a brain area lights up differently in a scanner after some pills? Intellectually, it may be. But doctors get very accustomed to drug company sales reps and enthusiastic researchers who approach them with an exciting theoretical reason why one treatment should be better than another (or better than life as usual without the miracle treatment): maybe their intervention works selectively on only one kind of receptor molecule, for example, so it should therefore have fewer side effects. Similarly, drug reps and researchers will often announce that their intervention has some kind of effect on some kind of elaborate measure of some kind of surrogate outcome: maybe a molecule in the blood goes up in concentration, or down, in a way that suggests the intervention might be effective.
This is all very well. But it’s not the same as showing that something really does actually work, back here in the real world, and medicine is overflowing with unfulfilled promises from this kind of early theoretical research. It’s not even in the same ballpark as showing that something works.
And oddly enough, someone has finally now conducted a proper trial of fish oil pills in mainstream children, to see if they work: a well-conducted, randomised, double-blind, placebo-controlled trial, in 450 children aged 8–10 years old from a mainstream school population. It was published in full this year (http://qurl.com/fish), and they found no improvement. Show me the news headlines about that paper.
Meanwhile Euromonitor estimates global sales of fish oil pills at $2bn, having doubled in five years, with sales projected to reach $2.5bn by 2012, and they are now the single best-selling product in the UK food supplement market. This has only been possible with the kind assistance of the British media, and their eagerness for stories about the magic intelligence pill.
Stuff:
You might also enjoy this takedown of the Observer piece by Dorothy Bishop, professor of neuropsychology at Oxford, and this at HolfordWatch. I should say, it makes me admire the Guardian even more that they publish a column like this about one of our own news articles. Although I do think the headline they used (“Omega-3 lesson, not so much brain boost as fishy research”) is wrong, as it wasn’t the research that was problematic, it was the reporting of it.