My current pet peeve in journalism (and oh, there are many, so it’s sometimes hard to choose) is lines like this: ‘according to researchers…’ ‘a recent study…’ etc. With no reference to the actual study or, sometimes, even the names of the researchers. Let alone, in online journalism, a simple link to the study for the convenience of readers. Which means that if you want to take a look at the study for yourself, or you want to learn more about the credentials of the researchers, you have to go on a lengthy treasure hunt to find it.
This is infuriating, and it’s really not very good journalism. You need to cite your sources. You can’t just talk nebulously about some study somewhere. You need to explain which study it was, so that people know where you are getting your information. I thought this was pretty basic stuff, but evidently it is not, because half the time when I read science journalism from people who are not science journalists, I spend the next half hour trying to find the study or studies that were referred to in the article.
When I read an article that just says ‘sources say’ with no information about where those sources came from or who they are, I view it with suspicion. I’m going to take the same stance when a journalist doesn’t cite the study upon which an article is based, because I have no information about who conducted it, where it was performed, who paid for it, and what kinds of biases might have been at play in its production. This means that the conclusions reported as factual need to be evaluated with care and consideration, because there might be something deeper going on than what we see on the surface.
For example, a study by a pharmaceutical company revealing a new use for an existing medication is something to be taken with a grain of salt. There’s a clear financial bias right there; the company has a vested interest in coming up with new uses for its drugs, because these can be used to extend patents and increase sales. In contrast, an independent study undertaken without support from the pharmaceutical company that comes to the same conclusion is far more interesting, because the outcome would be less biased; the researchers were purely interested in knowing whether the medication had other potential applications.
Journalists are, by and large, well-read, well-informed people who are capable of critical thought. This is kind of required for the work. But many in the US seem to have poor scientific literacy, which is a reflection of the general state of education in the US, and a serious problem when they’re reporting on science in the news. If you don’t understand how science works and you’re reporting on science, you’re inevitably going to make some serious errors, and your readers, who may be even less scientifically literate than you, are not going to catch and address these errors on their own.
How many people see a note like ‘researchers say that…’ and think to look into who did the research and what the circumstances were, let alone actually look up the study and learn more about the specific nature of what the research did and did not show? Thanks to the way commentary spreads through online journalism and trickles down into blogs and other sites, a single article without a factual citation becomes horrifically magnified as people repeat it and repeat it, citing the original article as the source and not once referencing the study itself. The conclusions reached in the article, complete with whatever spin the journo wanted to create, are repeated endlessly, and readers come to take them as factual.
As the game of telephone progresses, it also gets harder and harder to trace things back to the original article and a chance at finding the actual study. ‘Researchers at the University of Leeds found that…’ becomes ‘British researchers…’ becomes just ‘researchers…’ and without the name of the paper, it becomes very difficult to trace the path of the information discussed. Sometimes you’ll be tantalised with the name of a journal, which makes things much easier to track down; oh good, now I only have to look through Pediatrics for the last six months to see if I can find this study, assuming the right journal was actually cited.
At a bare minimum, a discussion of a scientific study should tell the reader who the authors were, when it was published, and where it was published. That provides enough information to easily locate it. Better yet, the journalist could actually namecheck the study, and if the piece is in online format, could provide a direct link to the journal article or an abstract, allowing the reader to quickly and easily check the source and explore the story in more detail. What’s not acceptable is information so vague that it’s impossible to even find the study referenced, leading at least some readers to wonder whether it happened at all, or whether the journalist is misinterpreting the study’s conclusions so wildly that the vagueness is deliberate, designed to make it difficult for readers to check the facts for themselves.
You don’t need to be a trained science journalist to cover issues of scientific interest in the news, or to reference research in the context of a larger piece you’re putting together, but you do need some basic scientific literacy and common sense. Rule number one: cite your sources. Make it easy for people to find scientific research. Expand the conversation.