Five things to remember when reading reports on scientific studies

Nothing's black and white

We see it over and over again: A new health study is published in a peer-reviewed medical journal, and it’s picked up by the media (thanks to the accompanying press release), summarized by reporters, and then given a provocative headline by their editors.

I recently came across a press release from the National Institutes of Health, announcing the results of a study reported in the New England Journal of Medicine that concluded that “coffee drinkers have lower risk of death.”

The researchers followed over 400,000 men and women, aged 50-71 at the outset, from 1995 to 2008. They excluded people who already had cancer, heart disease, or stroke. Coffee consumption was assessed — just once! — at the start, via self-report.* They then looked at how many people died in the thirteen years that followed.

I want to look at this particular study today, not to discredit it, but to use it as an example of how things can get twisted and overblown in the reporting and analysis.

1. Correlation does not equal causation

Just because people drink coffee doesn’t necessarily mean it’s the coffee that helps them live longer. This study showed only a correlation. In fact, the study showed that coffee drinkers overall actually had a higher risk of death:  “In age-adjusted models, the risk of death was increased among coffee drinkers.”

They then point out that coffee drinkers were more likely to smoke (why don’t we wonder if coffee caused them to smoke?) — but after they adjust for “tobacco-smoking status and other potential confounders, there was a significant inverse association between coffee consumption and mortality.”

So what’s that saying? It means they have to do some statistical analysis to mathematically try to remove smoking and other probable causes of death from the equation. (In this case, they also note that the mathematical results were similar for a subgroup of people who never actually smoked — so that’s a good indicator that their math is probably correct.)
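To make “adjusting for a confounder” a bit more concrete, here’s a minimal sketch in Python using entirely made-up numbers (not the study’s data). In this toy setup, smoking is deadlier and is also more common among the coffee drinkers, so the crude comparison makes coffee look bad until you compare within smoking groups:

```python
# Toy illustration of confounding, with entirely made-up numbers
# (not the study's data). Smoking raises the death rate and is more
# common among coffee drinkers, so the crude comparison misleads.

# counts[(coffee, smoker)] = (deaths, total)
counts = {
    (True,  True):  (300, 1000),   # coffee-drinking smokers:     30% died
    (True,  False): (100, 1000),   # coffee-drinking non-smokers: 10% died
    (False, True):  (75,   250),   # non-coffee smokers:          30% died
    (False, False): (210, 1750),   # non-coffee non-smokers:      12% died
}

# Crude (unadjusted) death rates: lump smokers and non-smokers together.
for coffee in (True, False):
    deaths = sum(counts[(coffee, s)][0] for s in (True, False))
    total = sum(counts[(coffee, s)][1] for s in (True, False))
    print(f"coffee={coffee}: crude death rate {deaths / total:.2%}")
# -> Coffee drinkers look *worse* (20.00% vs 14.25%), because more of them smoke.

# Stratified ("adjusted") comparison: compare like with like, within each stratum.
for smoker in (True, False):
    d1, n1 = counts[(True, smoker)]
    d0, n0 = counts[(False, smoker)]
    print(f"smoker={smoker}: coffee {d1 / n1:.0%} vs. no coffee {d0 / n0:.0%}")
# -> Within each stratum the gap disappears (30% vs 30%; 10% vs 12%):
#    the crude excess was the smoking, not the coffee.
```

The study’s actual adjustment used regression models rather than simple stratification, but the logic is the same: compare like with like.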

Ultimately, though, the conclusion says this: “Whether this was a causal or associational finding cannot be determined from our data.”

So it does not mean that if you drink coffee, you’ll live longer because you drink coffee. It means that people who drink coffee happen to live longer (and it doesn’t even say how much longer). It may have nothing to do with the coffee itself — it may just be that the type of person who drinks coffee is more likely to live longer, for some entirely different reason.

2. Headlines can be just plain wrong

In today’s incredibly fast, not-so-detail-oriented news cycle, it’s the headline that gets the most attention. The nitty-gritty details of the articles (and studies) are often missed. Some news outlets are very careful with their words, but others tend to be a bit more alarmist — and therefore more likely to be inaccurate.

Take these headlines, for example, about the coffee study:

  1. Coffee Drinkers Have Lower Risk of Overall Death, Study Shows (ABC News)
  2. Can Coffee Help You Live Longer? We Really Want To Know (NPR)
  3. Study: Coffee lowers risk of death (The Columbus Dispatch)
  4. Coffee Lowers Disease Risk: Study (The Daily Beast, which is part of Newsweek)
  5. Coffee-drinking lowers risk of death, big study finds (The Boston Globe)
  6. Coffee Shown To Reduce Risk Of Death: Study (NBC Miami)
  7. Coffee Reduces Death Risk (About.com)

I’d say that ABC’s headline (#1) is probably the most faithful to the study’s findings, since it says “Coffee Drinkers” instead of “Coffee.” NPR (#2) dodges the problem by turning it into a question. The remaining five are completely wrong interpretations of the study.

As I mentioned above, coffee was not found to causally lower the risk of death — the study authors specifically point out there was only a correlation! But when you read those headlines, it’s hard not to think, “Great! I’ll drink more coffee so I’m less likely to die!”

3. Reporting is an interpretation

Just as the headlines can be misleading, so can the reporting. By definition, if you’re reading an article from a news outlet, you’re reading a reporter’s interpretation of the study. Good reporting, of course, will be very careful to be factually correct. But sometimes little details can shift, ever so slightly, and when it comes to studies like this, little details can make large differences. Even at large, reputable news organizations.

As a simple example, a Los Angeles Times article says “And the link was stronger in coffee drinkers who had never smoked.” That’s sort-of correct, but not technically accurate. As I mentioned earlier, the study found that the coffee drinkers who also smoked had a higher risk of death, but when mathematically adjusted to account for the smoking, the association between coffee and death was “similar” (according to the abstract on the NEJM website).

My point is that once a study is interpreted by someone else, the details may shift, even just slightly, and in something where correlation and causation are already fuzzy, it can make a big difference in how we, the casual readers, interpret the findings.

4. Study authors and funders are often biased

I don’t think this is necessarily the case in this particular coffee study, but it bears pointing out: It’s really, really important to know who is paying for a study, and who designed and conducted it. Although ethical standards require disclosure of conflicts of interest, they don’t require recusal the way a judge must recuse himself or herself from a trial. The prevailing practice is that disclosure is enough — as if knowing that someone is biased were enough to eliminate the bias itself (guess what? it isn’t).

The reputable medical journals are, of course, generally good at disclosing these conflicts of interest. But oftentimes those details are buried, or omitted altogether, by the time the story gets to the mainstream news.

If the authors of this coffee study owned stock in Starbucks, would that change your opinion of their findings? Probably on an intellectual level, yes. But I bet there would still be that little voice in the back of your mind that now thinks it’s a good idea to drink more coffee.

5. Percentages are incredibly misleading

This is the biggest pitfall, in my opinion, because it is so often overlooked: Studies typically report the amount of change as a percentage of the original statistic, not as an absolute difference.

In this case, the study found that “relative to men and women who did not drink coffee, those who consumed three or more cups of coffee per day had approximately a 10 percent lower risk of death.”

Okay, so coffee drinkers were 10% less likely to die. Here’s the important part: That does not mean their likelihood of dying dropped by a full 10 percentage points. What it means is that it was a 10% change from the original likelihood — not a 10-point change in the overall statistic.

In the coffee study, they tracked 229,119 men. During that time, a total of 33,731 of them died. So the overall chance of dying for a man in this group was about 14.7%.

The study found a roughly 10% change in the death rates of coffee drinkers. Ten percent of 14.7% is 1.47%. So, a little bit of math reveals that the coffee drinkers had a 13.23% “chance” of dying — which is reported as a 10% improvement. But really, it’s just a 1.47-percentage-point improvement overall.

So all these headlines that claim “Coffee reduces death risk”?  It means that if you’re a coffee drinker aged 50-71, you have a 13.23% chance of dying over the next thirteen years instead of a 14.7% chance.

(These numbers are looking at overall totals — in reality we’d need to split it up into sub-groups, like smoking-coffee-drinkers and non-smoking-coffee-drinkers, and how many cups of coffee they drank each day… but I’m simplifying a bit to make the point.)
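If you want to check that arithmetic yourself, here it is as a few lines of Python. The counts are the men’s numbers quoted above; the 10% relative reduction is the study’s rounded figure, so the result differs by a hair from the 13.23% I got by rounding earlier — same story either way:

```python
# Relative vs. absolute risk, using the men's numbers quoted above.
deaths, total = 33_731, 229_119
baseline_risk = deaths / total        # ~0.147, i.e. about a 14.7% chance of dying

relative_reduction = 0.10             # the study's "approximately 10 percent lower risk"
absolute_reduction = baseline_risk * relative_reduction   # ~1.47 percentage points
coffee_risk = baseline_risk - absolute_reduction          # ~13.2%

print(f"baseline risk:          {baseline_risk:.2%}")
print(f"absolute reduction:     {absolute_reduction:.2%} (percentage points)")
print(f"coffee drinkers' risk:  {coffee_risk:.2%}")
# A "10% lower risk" headline means roughly 14.7% -> 13.2%,
# not 14.7% -> 4.7%.
```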

This may be statistically significant, and it is certainly a strong enough result to warrant further research into the potential death-defying properties of coffee, but it hardly warrants bumping your daily coffee intake up to four or five cups, as the headlines would lead you to believe.

So next time you’re reading about a study, keep all this in mind, and take everything with a big grain of salt.**

* Recall reporting is notoriously lousy; people under- or over-report almost without fail. And in this case, they asked people to say how much coffee they currently drank — and then did not follow up on their coffee-drinking habits over the next thirteen years. How many people quit drinking coffee? How many increased their intake? Keeping an eye on where the data comes from should probably be its own item in the above list.

** But not actual salt, since that’ll raise your blood pressure and correlate to a 15.2763% greater chance of death…

Photo by Alvin Trusty, used under Creative Commons License.


24 Comments
GREG
June 8, 2012 3:33 pm

Of course, there is a 6th thing to remember: next week there will be a new study totally contradicting the one you are trying to understand.

Karen
June 5, 2012 11:49 am

“Coffee Drinkers Have Lower Risk Of Overall Death” — that sounds to me like they either cheat death completely (and never die), or some parts of them keep living. Like, “Okay, but she didn’t die OVERALL, her hand is still alive, thanks to coffee!”

June 5, 2012 8:18 am

When I was a statistician, I would always tell people to distrust scientific studies reported in the mainstream media. I’m always happy when I see other people stressing it too.

marilyn kyle
June 5, 2012 6:26 am

Reminds me of one of my fav bumper stickers:
“65% of all statistics are wrong”

June 5, 2012 3:34 am

Great article. While this geek knows statistics, he has a hard time explaining statistics to others. This article will help in the future. Great work.

June 5, 2012 2:03 am

Very useful article Andrew, well done! Being a master’s student in Health Psychology, I know all too well how research findings are misinterpreted. Especially since most of the public doesn’t bother reading the actual source or doesn’t know how to (this should be taught early on in the education system). Thanks again for your hard work.

June 4, 2012 2:19 pm

I love that you did this, Andrew. So many people have no idea what goes on during research and studies and what it all means. I think learning all that in my Methodology class in college made me quite skeptical of anything so I’m always reading reports with many grains of salt.

Dennis
June 4, 2012 2:09 pm

Good points, and a little fine-tuning. Re: Percentages are incredibly misleading: an easy way to remember this is to look for the words “absolute” vs. “relative” in the reported percentages. That’s what you’re talking about; if they’re using relative numbers (not always stated), you can almost automatically reduce the claimed effect to a small fraction of the actual effect, as you’ve shown in your example. But those larger relative numbers make differences more likely to reach “statistical significance” than the absolute numbers would. (Significance asks: what is the likelihood of getting the result by chance? If it’s small enough, e.g., .001, it’s not a chance effect but a real one.) On the flip side, often a result DOESN’T reach statistical significance but they accept the result as if it did, because it came close. Uh uh, doesn’t work that way. It does or it doesn’t reach significance, and if…

Reply to Dennis
June 5, 2012 7:22 am

Another thing to remember about study authors: in general, when dealing with articles from reputable academic centers, the first author is the guy who did the bulk of the work. The second author is the guy who wrote most of the paper (if the first guy didn’t write it himself), and the last author is typically the Chief of Service or Department Chair, who probably never even saw the research. Stuff coming out of industry-sponsored labs may not adhere to this standard.

marctr
June 4, 2012 1:58 pm

I love this! Forwarded-forwarded-forwarded emails aren’t nearly as popular as they used to be, but those used to be my big pet peeve. Did anyone check the source?!? Of course not! My sister-in-law is trying to lose weight, and she keeps telling me all the latest weight-loss “secrets”: eating watermelon, don’t drink water with your meals, etc. How about don’t eat a 4th meal before you go to bed? (Her weakness.) People always want to find validation for their habits, and the media complies. It might make them feel better, but it won’t help them get whatever result they are looking for. Your tips should go without saying, but thanks for saying them.

Joe
June 4, 2012 12:55 pm

You did it again, Andrew! Love the article. I think about the same thing every time there’s an article trying to create a pretty random correlation between two things without taking into account a lot of the other aspects out there. There are so many variables in things like that, and they just try to dumb it down to two things to make it an interesting read. I studied Finance in college and took some marketing classes. That stuff is used all over the place constantly, and I don’t think most people realize just how manipulated you are with your money and lifestyle. Totally oblivious.
