PG-13 Rated Movies, Societal Violence and the "Yes I Said It/No I Didn't" Game

03/26/2015 02:08 pm ET | Updated May 26, 2015

Back in 2013, researchers from Ohio State University and the Annenberg Public Policy Center at the University of Pennsylvania published a study suggesting a rise in gun violence in movies over the past half-century, particularly in PG-13 movies. Although the authors had no causal evidence linking such films to youth violence (nor even correlational evidence), they nonetheless implied a direct link, stating: "The effects of exposure to gun violence in films should not be trivialized. . . . The mere presence of guns in these films may increase the aggressive behavior of youth."

Except that during the past few decades in which this rise in PG-13 movie gun violence is supposed to have occurred, youth violence plummeted by nearly 90%. This article represents an unusual circumstance in which scholars not only make the mistake of implying causation from correlation...but the correlation runs in the completely wrong direction.

This did not go unnoticed by other researchers at Villanova University and Rutgers, led by Patrick Markey. Markey and colleagues conducted time-series analyses and found no evidence for a relation between movie violence and real-life homicide and aggravated assault (if anything, they found that violent movies seem to be related to a lowering of real-world violence). The Villanova/Rutgers team concluded with concerns about premature causal attributions regarding movie violence and societal violence, and cautioned scholars against making the kinds of premature causal inferences that the Ohio State/Annenberg team had made.

This represents a pretty straightforward exchange of supposition ("Movie violence may be related to real-life violence") fact-checked against data ("No, it isn't"). But these two articles only begin a rather unusual exchange.

In the same journal, the Ohio/Annenberg team commented on the new analysis by Markey and colleagues, claiming that neither they nor anyone else in the media violence field was suggesting violent media is related to horrific acts of violence. This seems to have been an unusual strategy, as it gave Markey and colleagues an opening, in their subsequent reply, to provide a list of 28 statements by media scholars, including members of the Ohio/Annenberg team, linking violent media to tragic events like Columbine and the Aurora, Colorado shooting...including two from the very Ohio/Annenberg article in which they deny making that sort of statement. At one point the Ohio/Annenberg team acknowledged comparing the magnitude of the correlation between media and aggression to that of important medical outcomes (a claim which itself has long been debunked) but stated they never said media violence is comparable to the dangers posed by smoking and lung cancer or other medical concerns. To this, Markey and colleagues provided a further list of statements in which such comparisons were indeed made, including this bold comment by members of the Ohio/Annenberg research team: "There are at least six instructive parallels between the smoking and lung cancer relationship and the media violence and aggression relationship."

This presents a remarkable example of one common problem in the media violence field, which I refer to as the "Yes I said it/no I didn't" phenomenon. Using this approach, scholars may feel free to make exaggerated claims and engage in considerable hyperbole, then retreat and claim victim status when other scholars criticize them for it. Very often this is done with weasel words like may or might. This is the same logic that politicians and politically charged news networks employ. They can say things like "President Obama might enjoy drowning little puppies." When someone complains and points out the factual error in this statement, they retort, "Hey, we said might." This is not only a problem in politics but a very serious cultural problem for this field of research, one that keeps it high on hysteria and low on careful objectivity.

Another serious issue is the field's willingness to be slippery with data. In their reply to Markey, and in a subsequent post by Annenberg, the Ohio/Annenberg team ignore a whole host of data indicating a long-term declining trend in societal violence, including a decrease in gun-related homicides and decreases in nonfatal firearm crimes, including by youth, during the past 20 years. Instead these researchers focus only on the specific category of gun-related injuries. The authors seem to conflate gun injuries, in which youth are injured by guns, with youth violence, in which youth use guns. Not all injuries are due to violent crime, of course; many are the consequence of accidents and tragic suicides. And just because youth are on the receiving end of gun injuries does not mean they are trigger-happy themselves. In fact, there clearly is no evidence for this. Data from the Bureau of Justice Statistics (see Table 4) document significant declining trends for both fatal and nonfatal gun violence among youth. It's time to stop pretending otherwise. Why are we inferring gun violence among youth from injury data when we have actual data on gun violence?

The bottom line is that there is little evidence to support concerns that PG-13 rated movies are sparking an epidemic of gun violence among youth, because there is no epidemic of gun violence among youth. In the absence of such an epidemic, some researchers may be too quick to cherry-pick and selectively report data on societal violence to give their research on topics like media violence more gravitas. It is crucial for us to change our academic culture to encourage scholars to become more conservative in extrapolating beyond their data. Let's not talk about smoking and lung cancer unless we're doing research on smoking and lung cancer.