We are all, ad nauseam, familiar with the debate over whether violent video games produce violent young people. You'd think that, after so many years of hyper-violent games and hysterical lawmakers, there'd be a consensus by now. So why isn't there?
The answer comes down to a few basic lessons in statistics:
Correlation does not prove causation.
Much, if not most, social research relies on correlations to show a relationship between two or more social forces. The trouble is, a correlation can only show that a relationship exists - it cannot tell you why that relationship is there. For example, there's a correlation between height and weight - tall people tend to be heavier. That makes intuitive sense. But the correlation itself is incapable of telling you why. You could infer from it that growing taller makes you weigh more; and while that may seem obvious, you could, with equal statistical authority (which is to say, none), infer that growing fatter makes you taller. The correlation does not tell you how the relationship works, and you have no authority at all to infer a cause from it.
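The height/weight example can be made concrete with a small simulation. This is a minimal sketch with invented numbers: a third variable, age, drives both height and weight in a simulated group of children, so the two end up strongly correlated even though neither causes the other.

```python
import random

random.seed(0)

# Hypothetical illustration (all numbers invented): age, a confounding
# variable, drives both height and weight in a simulated group of kids.
n = 1000
age = [random.uniform(5, 18) for _ in range(n)]
height = [60 + 7 * a + random.gauss(0, 8) for a in age]  # cm, driven by age
weight = [5 + 3 * a + random.gauss(0, 5) for a in age]   # kg, driven by age

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(height, weight)
print(f"correlation between height and weight: r = {r:.2f}")
# Strong correlation, yet neither variable causes the other here:
# both were generated from age alone.
```

The correlation coefficient comes out high, but by construction the only causal force in this toy world is age - exactly the kind of lurking third variable a correlation can never rule out.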
So what does this have to do with violent video games? Well, people infer causation from correlation all the time. Whether it's strongly hinted, as in this article, or explicitly stated, as it is by so many noisy lawmakers, the reasoning runs: some violent people enjoy violent video games; therefore, there is a correlation between violent people and violent video games; therefore, violent video games cause violence in the real world.
I'm sure you can see the error in that line of thinking, but many people do not.
What's more, when studying entire groups of people and their behaviour, the correlation is often the easiest statistic to obtain. You want to find out if there's a relationship between violence and violent video games? Interview people who play violent video games and test how they score on a measure of violent behaviour or violent thinking. It's cheap and easy to do. Then, if you do find a correlation between the two, all you can legitimately tell the world is exactly that: there's a correlation between the two.
Statistics are often misinterpreted.
But, you see, the media is not interested in saying "there's a correlation between violence and violent video games", and oftentimes reporters are not educated enough to know that a correlation can't show a cause-and-effect relationship. Vague statements about "correlations" or "regression analysis" don't sell papers. So the headline becomes "Scientists find that violent people play violent video games" - and the readers are left to infer the rest. Slightly more savvy reporters will write something more along the lines of "Scientists find link between violence and violent video games." It suggests that there's a relationship without flagrantly misrepresenting the study being reported on.
As a side note, whenever you read or hear in the media that scientists have found a "link" between things, you may as well stop reading. Those scientists have probably started exploratory research, probably haven't explained the causal forces at work, and the journalist is probably just trying to make a story out of it. Move on, or read on with a great deal of skepticism.
Early research is often contradictory.
Generally speaking, it is in the early stages of the exploration of a social phenomenon that one witnesses a great deal of correlational research. Traditionally, research starts with a broad focus and narrows down to finer detail as time goes on. After all, what's the use in bringing out the big guns on a topic if there's no correlation to begin with? This is especially true in sociology and social psychology, as human behaviour is extremely difficult to track in the lab.
What comes hot on the heels of correlational research is what's called a pre-test post-test design. Measure your participants' level of aggression before playing a game, and then do the same after. You'll need some of them to play a violent video game, and some of them to play a non-violent video game (this is the "control group"). So that you don't have big differences between the groups, randomly assign who plays what. These sorts of studies into video game violence have been going on for some time now. Researchers Karen and Jody Dill analysed the existing literature on the topic in 1998 and found something that's often found in early research: the studies that existed were contradictory, inconsistent, and fraught with methodological problems. Though there can be no definitive answer, the most likely reason for the research being contradictory has to do with the latter two issues.
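The pre-test post-test design described above can be sketched in a few lines of Python. Everything here is invented for illustration - the participant count, the aggression scores, and the small bump given to the treatment group - but the skeleton (random assignment, measure before, measure after, compare the change) is the design itself.

```python
import random

random.seed(1)

# Hypothetical sketch of a pre-test post-test design (all numbers invented).
participants = list(range(40))
random.shuffle(participants)       # random assignment to conditions
treatment = participants[:20]      # these play a violent game
control = participants[20:]        # these play a non-violent game

def aggression_score():
    """Stand-in for a real aggression measure (invented numbers)."""
    return random.gauss(50, 10)

# Pre-test: measure everyone before they play.
pre = {p: aggression_score() for p in participants}
# Post-test: measure again afterwards; here we fake a small bump for
# the treatment group so there is something to compare.
post = {p: pre[p] + random.gauss(2 if p in treatment else 0, 5)
        for p in participants}

def mean_change(group):
    return sum(post[p] - pre[p] for p in group) / len(group)

print(f"treatment change: {mean_change(treatment):+.1f}")
print(f"control change:   {mean_change(control):+.1f}")
```

The random assignment step is the crucial one: it's what licenses a causal reading of the difference between the groups, which a bare correlation never can.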
In a more recent analysis of the literature, researcher Christopher Ferguson found continuing methodological issues and even bias in current publications. Primarily: data that show no difference (that violent video games do not increase aggression any more than non-violent ones) often go unreported, or are simply ignored by researchers in favour of trumpeting marginal, inconsistent, or barely-significant results. This is typical, as publishers are much more interested in printing studies that show a result (as though showing no difference is "not a result"). Further, studies that showed greater results tended to use measurements that were not "standardized" (that is, not previously demonstrated to measure what the researchers claim they measure).
"Statistically significant" doesn't necessarily mean "important".
In his review of the literature, Ferguson brings up an absolutely massive issue that is often ignored by researchers: A statistically significant variable may not have much effect at all on what you're studying. For example: Johnny wants to study video game violence. He recruits a thousand people for his study, and finds that there's a 0.5% increase in aggressive thinking after playing a violent video game. This result is statistically significant. ...But it accounts for a 0.5% change. Who cares?
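Johnny's predicament can be demonstrated numerically. In this sketch (sample size and effect entirely invented), a tiny true difference between two large groups produces a comfortably "significant" p-value while the effect size, here Cohen's d, stays small; the p-value uses a large-sample normal approximation rather than the exact t-distribution.

```python
import math
import random

random.seed(2)

# Hypothetical sketch (all numbers invented): with a large sample, a
# practically trivial difference is still "statistically significant".
n = 5000
control = [random.gauss(100.0, 10.0) for _ in range(n)]
treated = [random.gauss(101.5, 10.0) for _ in range(n)]  # tiny true shift

def mean(xs):
    return sum(xs) / len(xs)

def sample_var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

diff = mean(treated) - mean(control)
pooled_sd = math.sqrt((sample_var(treated) + sample_var(control)) / 2)
t = diff / (pooled_sd * math.sqrt(2 / n))  # two-sample t statistic
# Large-sample normal approximation of the two-sided p-value.
p = 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))
d = diff / pooled_sd  # Cohen's d: effect size in standard-deviation units

print(f"statistically significant? {p < 0.05}; effect size d = {d:.2f}")
```

Significance only says the difference is unlikely to be pure chance; the effect size says how big it actually is - and with thousands of participants, chance gets ruled out for differences nobody should care about.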
Not to suggest any ulterior, sneaky motives, but much research in human behaviour does not measure the size of the statistically significant effect. One reason may be that researchers are not often taught about effect size. Another may be that reporting a small effect size makes research less marketable to publishers. There could be any number of reasons; however, the fact remains that careful readers of research into human behaviour will notice a troubling lack of researchers calculating effect size alongside "significance".
State versus trait and media cherry-picking.
Even if there were statistically significant research showing that violent video games have a great effect on people's behaviour, most of the non-correlational research out there only measures the effect of violent games immediately after gaming. Yet the message that the talking heads in the media are sending is that long-term play of violent games changes people's behaviour. What researchers are researching is very different from the message that's getting to the public. Research indicates there may or may not be a short-term effect of violent video games on behaviour. This is called a change in "state." Many media outlets and politicians have built a story telling us that violent video games change people into violent criminals. This is called a change in "trait" - essentially, a change in personality. (It should be noted that whether a person's personality can be significantly changed after childhood is, and has been for some time, a hotly-debated issue in psychology research.) Kotaku has an interesting article touching on this, and on recent research, here.
What's more, with so much contradictory literature out there, media outlets are free to pick and choose which studies they wish to quote in order to build their story. Media and political distortion is horrible enough even when there is unanimous scholarly consensus. Research shows that GLBT people are harmed by reparative "therapy," that they are equally capable of parenting as heterosexuals, and that they are harmed by homophobic social messages; mountains of research from around the world supports climate change models, with only a few contradictory studies; across many disciplines there is absolute proof for the mechanics of evolution - proof that has been amassing for over a century. Imagine how distorted the media/political translation of the research will be with a topic on which there is no consensus.
What does all of this suggest for the future of the debate over violent video games? Well, don't expect it to be solved in the next few years. It is this author's experience that the crassest explanation is usually the correct one in debates such as this: in the end, some politicians and media figures are making too much money and accruing too much power to let the story die. Meanwhile, research publications will continue to be biased toward "showing a result." However, the sophistication of the research into this topic should continue to grow, and with it, at some point, consensus.
[image via: Geeked]