The 12th anniversary of September 11 prompted a number of articles on terrorism and the counter-terrorism policies the US has pursued over the past 12 years. Brad Plumer and Dylan Matthews at Wonkblog both have interesting pieces: Plumer summarizes facts about terrorism in the modern world, and Matthews summarizes the recent empirical literature on the efficacy of counter-terror measures. The biggest takeaways for me:
1) The chances of actually dying in a terrorist attack are very low:
In the last five years, the odds of an American being killed in a terrorist attack have been about 1 in 20 million (that’s including both domestic attacks and overseas attacks). As the chart above from the Economist shows, that’s considerably smaller than the risk of dying from many other things, from post-surgery complications to ordinary gun violence to lightning.
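To put that "1 in 20 million" figure in perspective, a quick back-of-envelope calculation translates the odds into expected deaths per year. The population figure below is my own assumption (roughly the US population circa 2013), not a number from the article:

```python
# Back-of-envelope: what do "1 in 20 million" annual odds imply?
# Assumption (mine, not from the article): US population of ~315 million.
annual_odds = 1 / 20_000_000
population = 315_000_000

expected_deaths_per_year = annual_odds * population
print(expected_deaths_per_year)  # ~15.75 expected deaths per year
```

Under those assumptions, the stated odds imply only about 16 expected American deaths from terrorism per year, which is why the comparison to post-surgery complications and lightning is so lopsided.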
2) But, there might be (a slight bit of) endogeneity here: we spend an enormous sum of money every year to prevent terrorist attacks, and it seems as if we have gotten (slightly) better at preventing terrorist attacks over the last 10 years:
However, a large number of attacks still succeed every year (in spite of our counter-terror efforts)--but the odds of death remain low because most terrorist attacks result in a (relatively) small number of deaths.
3) We have NO IDEA how effective our counter-terrorism measures are:
The seven studies include among them 86 findings about the effectiveness of counterterrorism programs, and those findings are startling. Lum, Kennedy and Sherley report that the average effect of the programs examined was negative. That is, the intervention was found to increase terrorist incidents rather than reduce them. The results varied by the type of intervention, but not in a way that should give us any comfort about our strategy.
First of all, we only have seven valid studies examining the efficacy of our counter-terrorism measures. That’s an absurdly small number. Granted, most of the counter-terror measures we take remain highly classified (cough NSA leaks cough), which means that academics are at a huge disadvantage in measuring the efficacy of counter-terror measures.
On that same note, most of the studies examine only small-potatoes stuff, like metal detectors at airports. It's much, much harder to measure the efficacy of interventions like using big data to analyze web communication and track terror cells.
How many terrorist attacks do we prevent every year as a result of these policies? How many of those attacks would have been carried out successfully even with the counter-terror capabilities we had in, say, the '90s? And how many additional terror threats do we face each year as a result of our military involvement in Iraq and Afghanistan, and our casual (read: inadequate) oversight of and response to NSA violations of privacy law?
I've discussed this topic before. Obviously, given the dearth of studies, it's hard to make a conclusive case one way or the other, but my suspicion is that the US doesn't get a whole lot of bang for its buck in terms of lives saved per dollar spent in the counter-terror arena, and that there are lots of other places we could be spending money that would save a greater number of lives in both the short and long term.
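The "bang for the buck" comparison above can be sketched as a cost-per-life-saved calculation. Every number below is a hypothetical placeholder I've chosen for illustration; neither the spending figures nor the lives-saved estimates come from the post or from any study:

```python
# Hypothetical cost-effectiveness sketch. All inputs are placeholder
# assumptions for illustration only -- not real estimates.
def cost_per_life_saved(annual_spending: float, lives_saved_per_year: float) -> float:
    """Dollars spent per statistical life saved, per year."""
    return annual_spending / lives_saved_per_year

# Placeholder: $50B/year in counter-terror spending saving 100 lives/year.
counterterror = cost_per_life_saved(50e9, 100)

# Placeholder alternative: $1B/year on some public-health program
# saving 500 lives/year.
alternative = cost_per_life_saved(1e9, 500)

print(counterterror)              # $500,000,000 per life under these assumptions
print(alternative)                # $2,000,000 per life under these assumptions
print(counterterror > alternative)  # True: the alternative dominates here
```

The point of the sketch isn't the specific numbers (which are invented) but the structure of the argument: until we have credible estimates of lives saved per year by counter-terror programs, we can't place them on this kind of comparison at all.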