A Columbia University doctoral student in epidemiology, along with professors from the NYU School of Public Health, the BU School of Public Health, and the Penn School of Medicine, published a study last week in The BMJ (formerly the British Medical Journal). The study purports to have found that “states with more permissive gun laws and greater gun ownership had higher rates of mass shootings, and a growing divide appears to be emerging between restrictive and permissive states.”
Studies like this are, unfortunately, often publicized without much critical thought. An article in The Houston Chronicle claims that this study “pushes back against” the argument that owning a firearm increases one’s personal safety. The study does no such thing, but why let a detail like that get in the way of anti-gun media bias at its worst?
The study’s researchers used The Traveler’s Guide to the Firearm Laws of the Fifty States to give each state an annual rating between 0 (completely restrictive) and 100 (completely permissive). This is a central component of their analysis, but the Traveler’s Guide was not designed for this use. The ratings in the Guide are arbitrary and appear to give each law the same weight, even though some laws are more onerous to gun owners than others. Even Daniel Webster, the Bloomberg Professor of American Health at the Johns Hopkins Bloomberg School of Public Health and the Director of the Johns Hopkins Center for Gun Policy and Research, raised this issue with Vox, telling them that reliance on this kind of scoring makes it “hard to draw concrete policy lessons from findings attached to such indices.” The Vox article spreads more inaccuracies about gun laws and related research than we can address here, but including even a bit of criticism is a welcome change from how other media outlets regurgitate the flawed findings of anti-gun researchers.
This is not to disparage The Traveler’s Guide. It is both interesting and useful, but it was designed as a quick reference for traveling gun owners. Using it in an attempt to quantify the differences in gun laws between states is ill-conceived at best. The study’s researchers also failed to reach out to the Guide’s author, so it’s clear they had no additional insight into the rankings.
We couldn’t find an explanation for the scoring in the Guide, but more recent editions have noted the reasons for changes to a state’s score in a given year. In 2018, Arizona’s score was unchanged from the prior year, with this explanation: “wide open desert & Constitutional carry make it one of our best.” Arkansas gained three points for enacting an enhanced concealed carry permit, while Iowa enacted stand your ground, a preemption upgrade, and State Capitol carry and gained only five points. Even if there were a formula behind the scores, they would still be arbitrary. There are also some odd categorizations in the Guide. For example, the 2010 edition lists California as “unrestricted, no permit or license required” to own a firearm, but the state has required residents to obtain a handgun safety certificate before acquiring a handgun since 2001.
The full data set was not available at the time of this article, but the charts included in the study show Massachusetts, Connecticut, Illinois, and Maryland all rated as more restrictive than California throughout the study period (1998-2015). Does that sound right to you? It doesn’t sound right to us.
But, again, the scores are based on a guide written for travelers, and there is no described system for assigning them. The scoring is arbitrary and for informational purposes only. It was not designed for use in an analytical model.
The annual score is not the only issue with this Columbia study. The researchers used the ratio of firearm suicides to total suicides in a state as a proxy for the gun ownership rate. This proxy is widely accepted, but it is not typically used as an independent variable, the central variable of interest in an analysis. Some reasonable control variables were included, but the violent crime rate and age cohorts were not, and there was no mechanism to control for enforcement of the laws, which matters when comparing a range of policies across states. The outcome variable presents the next considerable issue: the number of mass shootings per million people in a state, drawn from the FBI’s Uniform Crime Report Supplementary Homicide Reports, which do not capture all homicides from all states and do not include Florida at all. Excluding one of the most populous states is odd.
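For readers who want to see how these quantities fit together, here is a minimal sketch of the two calculations described above: the firearm-suicide proxy for gun ownership and the outcome rate per million residents. The state names and figures below are hypothetical, invented purely for illustration; they are not the study’s data.

```python
# Hypothetical figures for illustration only; not data from the study.
states = {
    # state: (firearm suicides, total suicides, mass shootings counted, population)
    "Large State": (400, 800, 2, 6_000_000),
    "Small State": (150, 300, 1, 600_000),
}

for name, (firearm_suicides, total_suicides, shootings, population) in states.items():
    # Proxy for gun ownership: share of suicides committed with a firearm.
    ownership_proxy = firearm_suicides / total_suicides
    # Outcome variable: mass shootings per million residents.
    rate_per_million = shootings / (population / 1_000_000)
    print(f"{name}: ownership proxy = {ownership_proxy:.2f}, "
          f"rate = {rate_per_million:.2f} per million")
```

Note that the small state’s single incident yields a per-million rate several times higher than the large state’s two incidents, a point we return to below.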
Let’s look at the data the researchers did use. They “found” 344 mass shootings from 1998 to 2015. The Mother Jones website lists 51 for the same time period. Mother Jones excludes incidents that occur as part of another crime (like a robbery, a home invasion, or gang activity) and focuses on incidents in public places. The Columbia study apparently used a broader definition, which yields a considerably higher number. It is important to consider the definition used. The researchers defined a mass shooting as “one event in which four or more individuals were killed by a perpetrator using a firearm and the perpetrator themselves did not count toward the total number of victims.” This sounds reasonable, but it includes targeted attacks, domestic incidents, and other criminal activity. The phrase “mass shooting” evokes the sort of random, public rampage captured by the Mother Jones definition.
Vermont had the highest rate of mass shooting deaths in this time period, at just below 0.3 per million people. The Gun Violence Archive, which also uses a broad definition of mass shootings, reports a single incident in Vermont during that period: a horrific event in which a woman killed three relatives and a social worker after losing custody of her child. While this was undeniably a terrible crime, it is not what most Americans picture when discussing mass shootings. It also shows how a single incident in a low-population state can substantially distort a dataset when dealing with rare events.
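To put rough numbers on that (ours, not the study’s): a state of about 600,000 residents needs only one four-victim incident to register roughly 6 to 7 mass shooting deaths per million over the period, while the same incident in a state of 6 million residents would register about one tenth of that. A single event can push a small state to the top of the rankings.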
Readers may notice that the rate of mass shootings is presented in terms of “per million people” instead of the customary “per 100,000 people.” This is because mass shootings are fortunately rare. Public mass shootings, the sort of incidents that the public consciousness associates with the term, are even more rare.
That is part of the reason why mass shootings are so difficult to study.
These events are even harder to study when the variables used in the analysis do not measure what they are purported to measure.