In addition to not being the best at math, journalists also tend to struggle with science. This makes reporting on scientific and medical studies difficult, because we have a tendency to take a study's findings at face value without digging in and asking the same kinds of questions we would ask if the source were a government agency or a business.
The result is that we latch onto the headline or the simplest explanation and either misinform our audiences or scare them while we are misinforming them.
For example, it is common knowledge among parents that screen time is linked to obesity. Why is it common knowledge? Because the media reported on several studies linking obesity to television as well as to smaller screens, even though some of those studies themselves indicated that television can't necessarily be blamed directly for obesity.
Enter a story published recently by the Associated Press about parental guilt and screen time. As it turns out, many of the screen-time studies we have been quoting and reporting on haven't taken socio-economic factors into account.
According to the article, children who watch more television also tend to be poorer, have less-educated parents, and belong to minority groups. Each of these factors is linked to obesity in its own right. As with most societal problems, many factors contribute to childhood obesity, not just one.
However, nobody’s going to click on a headline that reads “Socio-economic status, other factors may lead to obesity,” so we settle on the low-hanging fruit: “Is Your TV Making Your Kids Fat?”
Don’t do that. Most scientific studies themselves note that correlation does not equal causation. The studies we report on typically describe two or three factors that are linked, yet our coverage often leads people to believe that the first thing actually causes the second.
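To see why linked things aren't necessarily cause and effect, here is a minimal, purely illustrative sketch (not from the AP article, and with made-up numbers): a single underlying "disadvantage" score drives both screen time and obesity risk, so the two end up correlated even though, in this toy model, screen time has no effect at all.

```python
# Hypothetical simulation of a confounder creating a spurious correlation.
# All parameters are invented for illustration; nothing here comes from a real study.
import random


def simulate(n=10_000, seed=42):
    random.seed(seed)
    screen_time, obese = [], []
    for _ in range(n):
        disadvantage = random.random()                       # confounder, 0..1
        hours = 1 + 4 * disadvantage + random.gauss(0, 0.5)  # screen time driven by confounder
        risk = 0.05 + 0.30 * disadvantage                    # obesity risk driven by confounder only
        screen_time.append(hours)
        obese.append(1 if random.random() < risk else 0)
    return screen_time, obese


def rate(flags):
    return sum(flags) / len(flags)


if __name__ == "__main__":
    hours, obese = simulate()
    median = sorted(hours)[len(hours) // 2]
    heavy = [o for h, o in zip(hours, obese) if h >= median]
    light = [o for h, o in zip(hours, obese) if h < median]
    # Heavy viewers show a higher obesity rate even though screen time has
    # zero causal effect in this model; the shared confounder does the work.
    print(f"Obesity rate, heavy viewers: {rate(heavy):.1%}")
    print(f"Obesity rate, light viewers: {rate(light):.1%}")
```

Running it, the heavy-viewing half of the simulated children comes out noticeably more likely to be obese, which is exactly the kind of pattern a headline can mistake for causation.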
This is the opposite of journalism. The goal is to inform, not confuse. Instead, we need to start looking more critically at these studies. Who commissioned the study? Who benefits from the findings? What factors did the study not take into account? Where did the study come from?
Sometimes a study comes from a person or entity trying to make a name for themselves. They repackage a boring or ill-planned study into something that looks interesting, send out a press release with a can't-lose headline, and an unsuspecting journalist gets played like a fiddle.
A journalist actually demonstrated this a few years ago by running a study that boldly proclaimed chocolate causes weight loss. The study was intentionally based on bad science to prove a point, yet journalists bought it without question. The headline was just too good.
It would be best for journalists to treat press releases about scientific studies the same way we treat press releases from government entities and politicians. Question them, get the full scope of the story, and interview some experts when necessary.