Here's an interesting fact. US-based surveys frequently find that impossibly large numbers of Americans have defended themselves with guns. For example, one such estimate implies that homeowners put up gun defences in well over 100% of the burglaries in which someone was home, awake, and there was a gun in the house. Similarly, such surveys imply that robbery and rape victims use guns against their assailants more frequently than assailants do against their victims-- not very likely. Yet these surveys don't seem to have any glaring methodological flaws, at least not of the sort we've discussed so far in this course. So what gives?

Well, we've run smack dab into a subtle measurement issue that can rear its head whenever we try to measure rare events. Let's illustrate with a simple numerical example. We survey 10,000 people and assume that, in truth, 100 of them have defended themselves with guns. But our survey, like every other survey, doesn't work perfectly. Sometimes we interview a guy who has defended himself with a gun. But, for whatever reason, we fail to register him as such. And sometimes we interview a guy who has not defended himself with a gun but incorrectly put him down as a gun defender. Welcome to the real world of survey research. Mistakes were made.

You may ask why we are making these errors, a question to which there can be many answers. Maybe I claim falsely that I fought off a tough guy with a gun because saying this just makes me look and feel pretty macho. Or maybe a key puncher in the office just happens to hit the wrong key while looking ahead to her upcoming lunch break. In other words, there are plenty of chances to make mistakes. Now, let's just say that we've got a 1% error rate in each direction. That means that, out of the 100 people who have defended themselves with guns, one slips through our fingers and gets recorded as a non-gun defender.

And out of the 9,900 people who haven't defended themselves with guns, we classify 99 as gun defenders, right? 1% of 9,900 equals 99. Holy guacamole. That's huge. We end up recording 198 gun defenders-- 100 minus 1 plus 99-- when the true number is just 100. In other words, we overestimate the number of gun defenders by almost a factor of 2. That can explain a lot. The key here is that even in America, defending yourself with a gun is still a rare event. The overwhelming majority of the population hasn't done this.

But when you apply 1% to a large number of people-- in this case, 9,900 people-- you get a much bigger number than you get when you apply 1% to a small number of people-- in this case, just 100 people. The errors in one direction don't even come close to balancing out the errors in the other direction. This problem can explain why multiple surveys overestimate the number of defensive gun uses in the USA. And the threat of similar overestimation looms large whenever we use surveys to measure the prevalence of rare events.
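To see how much rarity itself matters, the same sketch can be repeated with different true counts while holding the 1% error rate fixed (again an illustrative addition, not part of the original material):

    def recorded_count(true_count, n_surveyed=10_000, error_rate=0.01):
        """Expected recorded count with a symmetric misclassification rate."""
        false_negatives = error_rate * true_count
        false_positives = error_rate * (n_surveyed - true_count)
        return true_count - false_negatives + false_positives

    for true_count in (100, 500, 1_000, 5_000):
        rec = recorded_count(true_count)
        print(true_count, rec, round(rec / true_count, 2))
    # 100 198.0 1.98
    # 500 590.0 1.18
    # 1000 1080.0 1.08
    # 5000 5000.0 1.0

In symbols, with true count T, sample size N, and error rate e, the expected recorded count is T + e(N - 2T). The bias term e(N - 2T) disappears only when half the sample has the trait; the rarer the event, the more the false positives swamp the false negatives.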

Asymmetrical Errors

This video clip summarizes a great article by David Hemenway, which makes two main points:

  1. Survey-based estimates of defensive uses of guns must be too high because they don’t square with other available pieces of evidence.

  2. The article offers a theory, explained in the video, of why these surveys overestimate the prevalence of defensive gun uses by so much. I’m hoping that this theory sounded familiar to you.

Discussion

What is the connection between Hemenway’s theory and the first 7 steps of this week?

This video is from the free online course:

Survival Statistics: Secrets for Demystifying Numbers

Royal Holloway, University of London