In a nuclear world, there exists the possibility that nuclear weapons and their constituent materials could drift out of the control of states and into the hands of terrorists. This statement is uncontroversial and has a distinguished lineage. In 1946, during closed congressional testimony, Robert Oppenheimer opined that a small and determined group could smuggle an atomic bomb into New York with relative impunity. The only defence, he stated, was the humble screwdriver, wielded by an army of inspectors tasked with rummaging through every consignment destined for the heart of that city.
Oppenheimer’s statement started an enduring debate on how best to counter the risk of nuclear terrorism, a debate that poses challenging but important questions for decision-makers. A nuclear terrorist event could cause severe physical destruction and shattering social and economic disruption, but the cost of effective counter-measures is potentially very high. Since the early 1990s, states have committed tens of billions of dollars to strengthening nuclear security systems, both unilaterally and through cooperative programmes such as the Global Partnership. So, what is the likelihood of nuclear terrorism? Has this likelihood decreased thanks to significant investment in nuclear security measures across the globe? Or has it increased as a result of the spread of violent transnational extremism? How should decision-makers weigh the risk of nuclear terrorism against other defence and security threats? These are exceptionally difficult questions to answer, and great care must be taken when constructing conceptual and mathematical models to explore them. We tackle the use of mathematical models in this area in our recent paper ‘Nuclear terrorism and virtual risk: implications for prediction and the utility of models’, published in the European Journal of International Security.
What the experts say
Estimating the likelihood of nuclear terrorism emerged as a parochial battleground in the years following 9/11, with two camps promoting distinct and opposing visions. The first sees nuclear terrorism as a near inevitability requiring immediate action on a global scale. Brian Jenkins’ comment that before 9/11 terrorists wanted “a lot of people watching, not a lot of people dead [whereas today] terrorists want a lot of people watching and a lot of people dead” captures their underlying rationale. Nuclear weapons hold the ultimate cachet for the modern mass-murdering, media-conscious bad guy. Barriers to the acquisition of nuclear materials, or even of complete weapons, are perceived as low, and the knowledge and technical skill necessary to operationalise them are widely available. Consequently, the likelihood of nuclear terrorism as assessed by this group is very high. Graham Allison, for example, estimates an annual probability of about 50%.
The second camp disagrees vehemently and promotes the opposing perspective: terrorist groups do not need nuclear weapons for impactful mass killing, their aspirations are at some variance with reality, and the technical barriers to successful deployment of an improvised nuclear device are great enough that the risk is negligible. In other words, the threat of nuclear terrorism is (and has consistently been) overblown. The prominence of the debate has also served to distract poorly informed decision-makers from more pressing defence and security matters. The likelihood of nuclear terrorism as assessed by this group is very low. John Mueller, for example, estimates an annual probability of roughly one in three billion (0.000000033%).
The difference between Allison’s and Mueller’s figures equates to nine orders of magnitude, a factor of a billion. A decision-maker seeking to assess the likelihood of a nuclear terrorist event would struggle to determine which of these estimates (each produced by a nuclear security expert with extensive professional experience) to take on board. Our recent work explains how polarised perspectives can arise in this area and sketches out what can be done to remedy this situation.
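The scale of the disagreement can be checked with simple arithmetic. The sketch below takes Allison’s roughly 50%-per-year figure and Mueller’s roughly one-in-three-billion figure and computes the gap between them:

```python
import math

# The two expert estimates of the annual probability of nuclear terrorism,
# expressed as fractions rather than percentages.
allison = 0.5        # Graham Allison: roughly 50% per year
mueller = 1 / 3e9    # John Mueller: roughly one in three billion per year

ratio = allison / mueller
print(f"Ratio of estimates: {ratio:.2e}")                  # ~1.5e+09
print(f"Orders of magnitude: {math.log10(ratio):.1f}")     # ~9.2
```

The ratio is about 1.5 billion: the two experts, looking at the same world, differ by roughly nine orders of magnitude.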
Estimating virtual risks
Analysts require two basic qualities to estimate event likelihoods numerically: normative and substantive goodness. Normative goodness is experience with numerical probability estimation, while substantive goodness is relevant knowledge in the field of study that experts can use to form judgements. While the former can be dealt with through training (or playing poker), the latter poses the major problem when the domain in question is nuclear terrorism. The reason for this is the paucity of relevant data. There are no known completed attempts at nuclear terrorism upon which analysts can base their judgements. While some relevant information does exist (concerning, for instance, the theft of nuclear materials from research laboratories, or the expressed motivation of members of Al Qaeda to develop nuclear weapons), it is limited, incomplete, ambiguous, and often applicable only to a small part of the puzzle.
As a result, nuclear terrorism should be considered as a virtual risk: it is not possible for any expert to have sufficient information to meaningfully ascribe probabilities and, as such, no analyst can attain high substantive goodness. Instead, the numbers that experts assign to potential nuclear terrorist events represent their subjective degree of belief rather than objective or data-based analysis. This is the reason Allison and Mueller’s figures are so distant from each other: the numbers are used to illustrate their opinions, nothing more, nothing less.
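The way priors dominate in the absence of data can be made concrete with a minimal Bayesian sketch. Everything in it is invented for illustration and appears nowhere in the paper: two hypothetical experts hold different Beta priors over the annual probability of an attack, then both observe the same record of zero events over an assumed 75-year nuclear age.

```python
def posterior_mean(alpha: float, beta: float, events: int = 0, years: int = 75) -> float:
    """Posterior mean annual probability under a Beta(alpha, beta) prior,
    after observing `events` attacks in `years` years (Beta-Binomial update).
    Illustrative only: both the priors and the window are invented."""
    return (alpha + events) / (alpha + beta + years)

# Same data (zero events), very different priors:
pessimist = posterior_mean(alpha=1, beta=1)     # uniform prior  -> ~1.3% per year
optimist = posterior_mean(alpha=1, beta=1e9)    # strong prior against -> ~1e-9 per year

print(f"Pessimist: {pessimist:.2e}  Optimist: {optimist:.2e}")
```

With no observed events, the data barely move either expert: the posteriors sit seven orders of magnitude apart, echoing the Allison–Mueller gap. The numbers that come out are, as above, subjective beliefs going in.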
Overall, then, decision-makers would do well to avoid the inclusion of numerical likelihood estimates in their considerations. These numbers are veneered with scientific authority, but in reality they represent subjective opinion encoded quantitatively. In the quest for evidence-based policy-making, decision-makers and expert communities alike must learn that there is no panacea. Models can be used to structure thinking, identify areas of disagreement, and direct future research efforts. However, while experts can contort the flimsy evidence to produce numbers that flatter their opinions, the search for objective figures is destined to fail. The nuclear community would do well to (re)watch the famous spoon-bending scene from The Matrix, and heed the advice given to the film’s protagonist therein:
Do not try and bend the spoon. That’s impossible. Instead, only try to realise the truth… There is no spoon… Then you will see that it is not the spoon that bends, it is yourself.
Image: Nuclear Threat Initiative, ‘Nuclear Terrorism: The Threat is Real’, via Vimeo.