The more you play, the more aggressive you become

That is the title of a paper by Brad Bushman and colleagues published in the Journal of Experimental Social Psychology online on 22 Nov 2012. But wait, there is more; the subtitle reads: 'a long-term experimental study of cumulative violent videogame effects on hostile expectations and aggressive behavior'. Read the whole paper here.

I know what you are thinking

Yes, I know this one has a high-voltage live wire in it, but bear with me; I promise it pays off. It's a long one, so you may want to skip to the highlights at the end. If you decide to read it through, it might help you appraise science better and seek out original sources, rather than rely on what the journalists report.

Why did I write this?

I came across this as I was going through my daily readings of new research in psychiatry. It is not really new, but it was pushed in my face on Medscape. I suppose not even medical media is immune to the base appeal of an eye-grabbing story. I write this because chances are it will be levied against us advocates for games and I want to help my fellow gamers be prepared. I also write it because I'm on a plane and I don't fancy reading at the moment.

The foregone conclusion

The paper concludes that exposure to violent videogames over time makes players more aggressive. Let me quote from the abstract (which is all any science reporter is ever going to read). 'It is well established that violent videogames increase aggression. There is stronger evidence of short-term violent videogame effects than of long-term effects' [...] 'As expected, aggressive behavior and hostile expectations increased over days for violent game players, but not for non-violent videogame players, and the increase in aggressive behavior was partially due to hostile expectations.'

So there it is, that proves it: videogames cause violence. Right? I expect that is the take-away message from this paper for most science reporters. I am sorry to say that I believe it is also the most likely take-away message for most of my peers.

Well, actually; not really, no

Brad and his mates clearly have an agenda. It is interesting how the title does not really match their stated results, even in the abstract. The title says 'the more you play, the more aggressive you become', not 'the more you play violent games, the more aggressive you become'. People in their study playing non-violent games showed no increase in their measure of aggression. The title is clearly misleading, and it shows their hand.

But that's not all. Having read the whole paper, I believe they misrepresent their findings in a way that seems deliberate to me. In fact, after reading their paper I am no wiser on the question of whether violent videogames induce violence or not. I learned nothing at all.

Let me entertain you

Hopefully, although a degree of nerdiness is required for this to be an enjoyable experience. I am now going to do the bit medical students enjoy the most when I take them through it. It's called critical appraisal, or paper-slaying as it is technically known. Before I do that, one more comment on the title: since when is 3 days long-term?

Brad and his merry band of researchers do not tell us what type of study this is, but if you are a nerd like me you can gather that it is a double-blind randomised controlled trial. The best way of judging this type of study is to unpick that. The questions are: Is it double blind? Is it randomised? Is it controlled? Is it a trial? Are the stats kosher? When they look at aggression, are they really measuring aggression? How big is the difference between the groups? Is the follow-up long enough? What does that difference mean in real life?

Is it double blind?

Double blind means that neither the researchers nor the participants know which group each participant is assigned to. This is to prevent bias.


Let's start with the participants. In the study they appear to blind the participants by lying to them about the purpose of the study when they recruit them -- they tell them it is about screen brightness rather than aggression. When they finish the study they interview the participants again to see if they clocked what the study was about. Brad and his confederates tell us participants never knew what the real question in the study was.

This sounds fine, but here is what I don't like: they never tell us the specific questions they ask in the debriefing interview. They tell us the topics they probe, but I really want to see those questions, because I can write a leading question if I need to; can't you?


Next up is the question of whether the researchers were blinded or not. Brad and his mates tell us they were blind to what type of game the participants played when they rated the essays. This sounds good, but there is an issue. To measure aggression, the participants play a competitive game where they get to blast their opponent with a nasty noise if they beat them, and they can choose how loud the noise is and how long it plays for.

My question is: were the researchers blinded to the type of game the participants had played when they got them to play the aggression-measuring game? Were they encouraging one group more than the other? We know from Milgram's classic experiments that researchers can really influence how aggressive participants will be in research situations.

My verdict: I do not have enough evidence to be satisfied the study was properly blinded. This is particularly important given that Brad and his mates have what appears to be a pet hypothesis (based on how they chose to write the title and abstract).

Is it randomised?

In other words, do participants have equal chances of ending up playing violent or non-violent videogames? Well, Brad et al. tell us they do. They don't tell us how they did it, though, which is really naughty. They also don't show us the typical nifty little table that tells us whether the groups are similar or different in terms of potential confounders such as age, gender, or previous history of violence, to name a few. If the rates of these things are more or less the same in both groups, then we can believe a bit more that they really did flip a coin. The only potential confounder they did look at, after collecting the outcome data, was the types of games usually played by the participants. They tell us this was the same in both groups. This is actually an extraordinarily important point and I will come back to it later.
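For the non-nerds: the balance check I am moaning about is trivial to do. Here is a minimal sketch, with an entirely invented sample of 70 participants (nothing below comes from the paper), of randomising people into two arms and checking whether a confounder ended up balanced:

```python
import random

# Sketch of the "nifty little table" I wanted to see: randomise 70 made-up
# participants and check the arms are balanced on a hypothetical confounder
# (here, a history-of-violence flag). None of this data is from the paper.
random.seed(42)

# Invented sample: 70 students, roughly 20% with a prior history of violence.
participants = [{"history_of_violence": random.random() < 0.2} for _ in range(70)]

random.shuffle(participants)
violent_group = participants[:35]     # assigned to violent games
nonviolent_group = participants[35:]  # assigned to non-violent games

def rate(group):
    """Proportion of the group with a history of violence."""
    return sum(p["history_of_violence"] for p in group) / len(group)

# If randomisation worked, these rates should be roughly similar; with only
# 35 per arm, sizeable imbalances can still happen purely by chance.
print(f"violent arm:     {rate(violent_group):.2f}")
print(f"non-violent arm: {rate(nonviolent_group):.2f}")
```

The point of the sketch: with groups this small, even honest coin-flipping can leave the arms lopsided on something important, which is exactly why the baseline table belongs in the paper.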

My verdict: I have no confidence that they randomised appropriately. I don't know if everyone with a history of violence ended up in the violent videogame group, for instance.

Is it controlled?

Seems obvious, right? There are two groups: one playing violent videogames and one playing non-violent videogames. If the violent videogame group is more aggressive on average, that means violent videogames predispose you to being aggressive. Well, this could all be the result of recurrent priming, which is reasonably well established as something that affects behaviour, at least in the short term. So I would have liked to see playing violent videogames compared against watching violent films. The way the control group is set up does not help to answer the question: is it violence in games, or violence in media generally, that affects the outcome?

Also, there is no 'no gaming' control. It might be that non-violent games reduce aggression compared to not playing games at all, or compared to playing violent games.

My verdict: no, it is not properly controlled.

Is it a trial?

OK, that it is. I won't quibble this point. It is just not a very well designed one.

My verdict: Yes, it is a trial.

Are the stats kosher?

What I mean is: are the statistical tests used fit to answer the question? I won't bore you with the details, but the latent growth curve analysis they use seems like a very odd choice to me for something that happens over three days and has just three data points. You usually use it over a longer time period and with more data points; some possible growth models are automatically excluded when you only have three. A simple test comparing means would be more appropriate, and it is much more often used in RCTs.
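For a flavour of the simpler analysis I mean, here is a sketch of comparing the two arms' final-day scores with a plain Welch t-test. Every number below is invented for illustration; the paper reports neither raw scores nor standard deviations:

```python
import math
import random

# Invented day-3 "aggression" scores for two arms of 35. The means (7 and 4)
# echo the figures I read off the paper's graph; the spread (SD 1.5) is a
# pure guess of mine, not a reported value.
random.seed(1)
violent = [random.gauss(7.0, 1.5) for _ in range(35)]
nonviolent = [random.gauss(4.0, 1.5) for _ in range(35)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

t = welch_t(violent, nonviolent)
# With ~35 per arm, |t| > 2 is roughly the conventional 5% threshold.
print(f"Welch t = {t:.2f}")
```

A test like this is transparent: you can see the means, the spread, and the statistic, and anyone can re-run it. That transparency is exactly what the fancier model obscures.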

My verdict: Maybe. They do not justify their choice, which matters, because these are not the standard stats used in RCTs. With their method it is impossible for me to calculate a Number Needed to Harm (how many people you would need to get playing violent videogames to cause one extra incident of aggression that you wouldn't otherwise get), which is a key piece of information for evaluating the risk that violent games may or may not pose. Rather suspect, if you ask me.
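The Number Needed to Harm is just the reciprocal of the absolute risk increase. A back-of-envelope sketch, with event rates that are entirely my invention (the paper's stats don't let you derive real ones):

```python
# NNH = 1 / (event rate in the exposed group - event rate in the controls).
# All rates below are hypothetical, purely to show the arithmetic.

def number_needed_to_harm(exposed_rate, control_rate):
    """How many people must be exposed to cause one extra adverse event."""
    absolute_risk_increase = exposed_rate - control_rate
    return 1.0 / absolute_risk_increase

# Hypothetical example: if 6% of violent-game players and 5% of
# non-violent-game players had an aggressive incident, you would need to
# expose roughly 100 people to cause one extra incident.
print(round(number_needed_to_harm(0.06, 0.05)))
```

That single number is what a clinician (or a worried parent) actually needs to weigh a risk, which is why its absence here is so frustrating.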

Are they really measuring what they say they measure?

Well, they do go on about how ethical this or that is in the lab, and they say that measuring whether someone plays a loud noise at someone else during competitive play is a measure of aggression. They say: 'Basically, within the ethical limits of the laboratory, participants controlled a weapon that could be used to blast their opponent with unpleasant noise.' Weapon? Really? What, like a vuvuzela is a weapon? If that is not reaching, I don't know what is.

My verdict: I accept that noise-blasting is an established lab measure of aggression. I have trouble believing that a greater willingness to annoy someone in the context of a competitive game is a valid proxy for willingness to hit someone. Aggression in this lab context means something very different to what you or I think of when we hear the word 'aggression'.

Was it long enough?

To call 3 days long-term is rather cheeky.

My verdict: very misleading!

How big is the difference?

Hard to tell. Brad and his mates make up a scale combining the duration and loudness of the unpleasant noise, going from 1 to 10. This measure is labelled 'aggression'. They don't spell out the average scores for the groups, which I find really annoying. The baseline for the scale, as measured by the average score of the non-violent videogame players on day 1, is around 3.5. By the end of the 3 days the average is 4 for the non-violent videogame players and 7 for the violent videogame players, a mean difference of 3 points on this scale. Forget the fancy stats they parade; that is the stuff we need to be paying attention to.
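To show why the raw difference alone is not enough, here is a sketch of a standardised effect size (Cohen's d) using the group means I read off their figure. The paper does not report standard deviations, so the SD below is an outright guess of mine; change it and watch the "size" of the effect change with it:

```python
# Cohen's d: standardised mean difference between two groups.
# The means (7 and 4) are my reading of the paper's figure; the SD is invented.

def cohens_d(mean_a, mean_b, pooled_sd):
    """Difference between group means, expressed in SD units."""
    return (mean_a - mean_b) / pooled_sd

assumed_sd = 1.5  # hypothetical; not reported in the paper
d = cohens_d(7.0, 4.0, assumed_sd)
print(f"Cohen's d = {d:.1f}")
```

Without the SDs, the same 3-point gap could be a monster effect or a modest one, which is exactly why the missing descriptive statistics matter.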

My verdict: The real-life meaning of those 3 points is so obscure that I don't know how big the difference is. It would have been good to get a measure of how distressing that level of noise was: whether it was just mildly annoying or burst-your-eardrums painful. In summary: who knows.

Is the difference meaningful in real life?

Well, again, who knows? Non-violent videogame players were 4 points annoying and violent videogame players were 7 points annoying. Does that mean they are more likely to shout at their granny or happy-slap a policeman? No idea. Brad and his friends do not look at that at all. We have no idea about the predictive validity of their scale against any real-life measure of aggression; the Overt Aggression Scale, for instance.

Also, the participants are a very narrow slice of the world at large. I don't think many conclusions about the real-world effects of violent or non-violent games can be drawn from what you see in 70 university students. They don't tell us their age, whether they drink, use drugs or smoke, how much time they spend playing games, what kinds of games they normally play, or whether they have a history of violence. All important things that affect how far we can apply what the study says to what happens in the real world.

My verdict: Brad does not give enough info to judge, but I think the difference is unlikely to be meaningful. I say this based on their own evidence. I'll explain later, but it is related to the fact that players in both groups had played, on average, the same amount of violent games, presumably for years before the study started.

My conclusions

I don't think this study shows anything at all. I am no closer to understanding the relationship between violent videogames, non-violent videogames and real-life aggression after reading this study. There may or may not be a relationship. This study lends no credence to either hypothesis.


  • Overhyped title that reveals their underlying beliefs. The omission of the words 'violent games' after the word 'play' in the main title is particularly telling.
  • No good info on participants. We don't know anything about them other than that they are university students and 50% of them are women.
  • No good info on blinding, particularly for the facilitators of the 'aggression'-measuring games.
  • No good info on randomisation. How did they do it? How do they know it was properly random? Seventy people is a small group, and you can easily get a skewed distribution of key confounders (history of violence or drug-taking) between the groups.
  • No proper controls. Everything could be explained by priming rather than gaming.
  • No proper measure of aggression.
  • Misrepresents 3 days as 'long-term'.
  • The difference between the groups is small and probably meaningless. They state there were equal numbers of players of violent games in both groups, who had presumably been playing for years, yet the error bars even at the start are very tight around the mean. If there were players of violent games in both groups and the effects get worse with time: a) I would expect the existing violent gamers in the non-violent game group to overlap with the violent game group, particularly at the start of the trial; and b) by their own argument, three extra days should make little difference compared to years of playing, yet it seems to.

Final thought

Violent games may prime us for aggressive behaviour, and that is something we all need to be aware of as gamers. However, if the effect exists, it is unlikely to be strong. There are millions upon millions of violent game players, and the overall rate of real violence (not blasting people with an annoying but otherwise harmless noise) has declined in the populations with the most access to violent games (see the adoption rates of CoD: Modern Warfare 4 and Steven Pinker's book 'The Better Angels of Our Nature' for a DIY correlation). I don't think the decline is thanks to games, but it makes it hard to blame games for whatever violence still exists in those communities. There are so many gamers, and violence is so rare, that it is like saying: 'having a right hand makes you violent, because most violent people have a right hand'.
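The right-hand analogy in numbers, if you like. All the figures below are invented to make the base-rate point, nothing more:

```python
# Base-rate sketch: when almost everyone is "exposed" and violence is rare,
# exposure tells you almost nothing. Every figure here is hypothetical.

gamers = 100_000_000      # invented population of violent-game players
violent_who_game = 45_000  # invented: "most violent people also game"

# P(violent | plays violent games): tiny, even though most violent people
# in this made-up world are gamers.
p_violent_given_gamer = violent_who_game / gamers
print(f"{p_violent_given_gamer:.5f}")
```

In other words: knowing that someone plays violent games shifts your estimate of their violence by almost nothing, for the same reason that right-handedness does.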

Don't let them befuddle you!


Comment by Anthony M. Karanovich on January 15, 2013 at 2:07pm

I will tell you the truth. Video games are not what make people violent. If you can't tell the difference between reality and virtual reality, then you have no purpose playing a game in the first place. I can tell you from experience, because I am a former spec-ops veteran with multiple deployments to Afghanistan and Iraq. Even though I have been to the battlefield, this still did not make me a violent person. It has everything to do with mental health, living conditions and social skills.

Comment by Andres Fonseca on December 22, 2012 at 7:49am

Nice piece of journalism. Quite a thoughtful view of the issue. Thanks for the link, Sergio.


