Bad Science Reporting

More old news today! I’m on an old news roll.

Recently, a fair amount of attention has gone to a study conducted at the University of California in which participants were fed diets high in either glucose or fructose for two weeks. Researchers carefully tracked changes to the participants’ health during the study and at its end. The study noted that people on the high-fructose diet appeared to have trouble processing the sugar, and that deposits of new fat cells appeared around their digestive tracts.

This study had 32 participants. With a sample size this small, you are obviously laying the groundwork for more research. The results seemed to suggest that there was some interesting stuff going on, and that it might be a good idea to explore it a bit further. I haven’t been able to find specifics about exactly how high the diets were in fructose and glucose, respectively, which would be helpful when evaluating the results, but, as a general rule, I would describe the findings as “interesting.”
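To see why a study this small mostly lays groundwork, here is a back-of-the-envelope sketch. The numbers are entirely hypothetical: I am assuming the 32 participants were split evenly into two groups of 16, and the standard deviation is invented purely for illustration. The point is only that small groups produce wide uncertainty around any measured difference.

```python
import math

# Hypothetical illustration -- NOT the study's actual data.
# Assume two groups of n = 16, and suppose the outcome measure
# (some change in fat deposits, arbitrary units) has a
# within-group standard deviation of 10.
n = 16
sd = 10.0

# Standard error of the difference between the two group means:
se_diff = math.sqrt(sd**2 / n + sd**2 / n)

# Approximate 95% confidence half-width (z = 1.96):
margin = 1.96 * se_diff

print(f"standard error of the difference: {se_diff:.2f}")
print(f"95% margin of error: +/- {margin:.2f}")
# With 16 per group, the margin of error is about +/- 6.93 units:
# only fairly large effects can be distinguished from noise,
# which is why results like these call for bigger follow-up studies.
```

With larger samples the margin shrinks in proportion to the square root of n, which is exactly why "interesting, needs follow-up" is the honest summary of a 32-person study.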

That’s not how the media described it.

“Study Shows High Fructose Corn Syrup May Cause Obesity, Diabetes, Heart Disease”

“Child Diabetes Blamed On Food Sweetener”

“Too Much Sugar Is Bad”

This is a common trend in science reporting, and it’s starting to piss me off. I get that people use sensationalized headlines to attract attention, and while I am not exactly pleased with misleading headlines, I understand why it’s done. Too often, however, the errors in the headline are repeated in the article; the entire piece becomes a laundry list of sensational and ludicrous claims based on a gross misreading of the study.

“Study Shows That More Research Is Needed Into Dietary Sugars,” I admit, does not have the same ring. It doesn’t grab the eye and demand that people look at the article. But it would be more accurate, and the articles themselves could have been much more balanced. They could have noted the small sample size, talked a little about confounding variables, and discussed the implications of the study without flatly stating that fructose causes these conditions. The study suggests a correlation, a link worth investigating further. That is not the same thing as identifying a cause.

Errors in science reporting irritate me because I don’t like to see studies mischaracterized, because they do science a disservice, and because they create unreasonable expectations among the lay public. Not describing science accurately, not taking the time to write a more nuanced piece, leads people to some very odd ideas about science and how scientific research works.

Most people aren’t familiar with how scientific research is performed, how to analyze the results of scientific studies, or how the scientific community regulates itself when it comes to publishing results and making statements. Thus, they can’t read articles like these critically, reading between the lines for the real information and perhaps noting to themselves that the story is more complicated than it appears at first glance.

We are increasingly a society in which members of the lay public don’t know how things work, and this is actually quite damaging. When people don’t understand the mechanics of scientific research, how economists think, or what’s really going on in Congress, it’s hard for them to have an informed opinion. It’s difficult to make an informed choice.

Misreporting of scientific information, particularly around food, has, I think, created a lot of faddish attitudes. People constantly change the balance of their diets because the latest study says this, or that, or the other thing; sometimes studies actively contradict each other, and people struggle to reconcile them. They read headlines like the ones above and go “oh, high fructose corn syrup is bad, I need to eliminate it,” instead of seeking out more information and deciding what they actually want to do: reduce it, cut it out altogether, wait and see what further studies show, and so on.

Fad dieting has been shown in a number of studies (including studies with very large sample sizes, and studies which have been conducted over years and sometimes decades) to be harmful. There are clear signs that jerking the diet around, not eating in a balanced way, and eating a highly restrictive diet are all potentially harmful and sometimes even dangerous, depending on whether or not someone has underlying health issues.

When science misreporting sends people off on another tangent of fad dieting, I’d argue that it’s harmful. Reporters who cover science need to be more conscientious about representing information reasonably, fairly, and accurately, and about making sure they are not leading readers to conclusions which were never actually reached in the study being covered.

To write about a study like this and conclude that high fructose corn syrup is bad and evil is a form of food shaming, and it’s also wrong. What this study shows is that in the 16 participants who ate a diet high in fructose, that diet appears to be linked with the formation of fat deposits (fatty liver in particular is a health concern). Which suggests that we need a more focused long-term study, including people who eat varying amounts of fructose, with controls for assorted variables, to see whether it is indeed fructose causing these problems, and at what level fructose in the diet becomes a cause for concern.
