Many people are familiar with the Andrew Wakefield study, which purported to find a link between vaccines and autism. The study became the driver for an entire movement, with vaccine deniers rallying behind it and using it as ammunition to skip or delay their children's vaccinations. Perhaps unsurprisingly, an uptick in vaccine-preventable illnesses and deaths followed. Notably, as scientists struggled to replicate the results and scrutinised Wakefield's methods, they reached an inescapable conclusion: it was bad science. The study was retracted, fairly quietly; some media outlets covered it, and public health advocates attempted to make the retraction more widely known.
This is often how it goes. Science is imperfect, about testing and exploring theories, and sometimes those theories are wrong, or the testing reveals unexpected results. Researchers write them up to provide information to fellow researchers, to explore where they might have gone wrong, to expand discussion and knowledge in their field. Maybe they’re doing original work, perhaps it’s a review of literature, or maybe it’s an attempt to replicate studies to confirm their validity. Scientists make methodological mistakes, studies go awry because they don’t have the information they need, or new developments reveal that their work was imperfect or incomplete.
When researchers are confronted with the fact that the results of a study are questionable or no longer valid, the responsible course is to sit down and review both the study and the conflicting evidence. They may meet with peers and editors at scientific journals as well as the researchers who disproved or questioned their work. The goal is to determine whether the allegations are in fact merited. Responsible researchers who agree that their work is flawed can opt to retract a study — to make a clear public statement that the study's conclusions were wrong and articulate why.
Retraction Watch is an organisation that keeps track of ongoing developments across the sciences, taking note of as many retractions as the editors can spot — they freely admit that they miss some, since there is no central database. When I started exploring the site, following my own interests in the sciences and how researchers interact with the general public, I was struck by a few things: There are far more retractions than I'd realised; scientists aren't very good at communicating retractions to the public; and many people, unaware of retractions, rely on bad science to advance their views — because many members of the public have limited scientific literacy and don't understand that a single paper or study doesn't provide enough material to make generalisations.
I knew that retractions were an ongoing thing with all scientific journals, no matter how rigorous their peer review and editing processes. Journals do their best to avoid having to pull a paper, and are unafraid in many cases to tell researchers to reconsider their work — at least, this is the case with reputable journals, and many members of the public can’t tell the difference between solid sources and those that are dodgy, because they haven’t been provided with the tools for judging research journals. There is a sense at times that only researchers sift through such material and thus that it doesn’t need to be accessible to the public — and that members of the public therefore don’t need to learn about how to spot a bad journal.
I was surprised, however, by how frequent retractions are, though I really shouldn't have been. Between the eternal problem of sketchy science and new developments in the field, retractions happen, and they are in fact a healthy part of the sciences. It would be nice if every single scientific paper were 100 percent accurate, but that's not the case. Learning that a paper is wrong can actually be incredibly informative, adding more depth and understanding to the research and helping contributors learn how to do better work in the future, in addition to providing opportunities to pursue research in new directions. Retractions aren't dead ends, but forks in the road.
Scientists, however, aren't good at communicating news to the public, which is an ongoing issue. Most retractions fly under the radar, with people blissfully unaware that researchers have determined their conclusions were erroneous or flawed. That might not seem like such a big deal, but it means that people aren't aware of retractions that might affect their lives, and it also means that people are rather shocked when retractions do surface, turning each one into a considerable event without understanding that retractions are actually a normal part of being active in the sciences.
Moreover, because people don't know about retractions and assume that science is gospel, they cling to research without really understanding it. Aside from the literacy problem of relying heavily on just one or two studies — and regurgitated versions of said studies, because many laypeople don't read the research itself — people operate in ignorance of the fact that the studies they base sweeping beliefs on are actually wrong and have since been retracted. With a growing interest in pseudoscience, this is a huge problem, because unscrupulous people use sketchy and since-retracted papers to justify absurd diets, push inaccurate information on people, and promote harmful social attitudes.
When reading about any kind of research, it’s important to actually go to the source. It’s also important to balance that source out with literature reviews and studies on related subjects — trust me, Andrew Wakefield wasn’t the only person researching vaccine side effects. And it’s critical to do some hunting and confirm that a paper hasn’t been retracted, because if it has, you need to find out why and learn about new developments in the field. You don’t need to be a scientist to do these things, but you do need to be willing to spend a little time digging.
Image: SCIENTISTS PARKING ONLY, evan p. cordes, Flickr