The Psychology of Why 94 Deaths from Terrorism Are Scarier than 301,797 Deaths from Guns

This is consistent with what I have been saying on this blog over and over again.

Risk perception (pdf) used to be based on an analytical equation: you multiply the probability of an event by the potential damage of its outcome. But Paul Slovic, a professor of psychology at the University of Oregon, understood the powerful role of emotions in decision-making and altered that equation, noting that many things affect how we perceive risk (see the sketch after this list):

  • trust vs. distrust (do you trust the person you are dealing with?)
  • control vs. lack of control (lack of control inflates risk perceptions)
  • catastrophic vs. chronic (catastrophic inflates risk perceptions)
  • dread vs. anger (dread inflates risk perceptions)
  • certainty vs. uncertainty (lack of knowledge about something inflates risk perceptions)
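
To make the contrast concrete, here is a minimal sketch of the two views in Python. It is my illustration, not Slovic's model: the death counts are the figures from this post's title, while the population figure and the assumption that both counts cover the same period are placeholders.

```python
US_POPULATION = 320_000_000  # rough round figure; an assumption for illustration

def analytical_risk(probability: float, damage: float) -> float:
    """The classical equation: risk = probability of the event times its damage."""
    return probability * damage

# Treat each death as damage = 1 and deaths / population as a crude
# probability of being a victim. The counts come from this post's title.
terrorism = analytical_risk(94 / US_POPULATION, damage=1.0)
guns = analytical_risk(301_797 / US_POPULATION, damage=1.0)

print(f"Analytically, guns carry ~{guns / terrorism:,.0f}x the risk of terrorism.")
# -> Analytically, guns carry ~3,211x the risk of terrorism.

# Yet terrorism *feels* scarier. On Slovic's account, the factors above
# (lack of control, catastrophic, dread, uncertainty) inflate its perceived
# risk enough to override a three-orders-of-magnitude analytical gap.
```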


Confirmation Bias (Part Two): How to Overcome Your Own Pre-Existing Beliefs


In the previous post, we linked to Part One of Sandman’s advice about confirmation bias. Here is an overview of Part Two:

Now I want to address a different question: how to overcome – well, partly overcome – your own confirmation bias.

I don’t want to sound too Pollyanna here, and I certainly don’t want to sound preachy or holier-than-thou. I have plenty of trouble making myself read articles I know I’m going to disagree with – and I’ve pretty much given up on making myself read them with an open mind.

Even Daniel Kahneman, the godfather of research into confirmation bias and kindred cognitive biases, confesses in his book Thinking, Fast and Slow that after decades of study he still regularly falls prey to them.

So don’t expect miracles. Yet some of Kahneman’s research deals directly with strategies for reducing confirmation bias. He and others have found some approaches that help at least a little.


Confirmation Bias (Part One): How to Counter Your Audience’s Pre-Existing Beliefs

Legendary risk communication expert Peter Sandman offers sage advice on how to handle confirmation bias when advocating about risk.

Confirmation bias is the universal tendency of human beings to hang onto what they already believe in the face of evidence to the contrary. You may know it by its endearing nickname, “myside bias,” which nicely captures its essence.

I’m not talking about intentional bias. That happens too. People sometimes go hunting for evidence that they’re right and then intentionally distort what they find, consciously building a biased case in hopes of winning an argument. Confirmation bias is unintentional. It’s how we win our internal arguments, how we convince ourselves we’re right.

Since this is a risk communication column, I want to focus here on the implications of confirmation bias for risk communicators. Your audience members are sure to filter your warnings and reassurances through their own preexisting opinions about what’s safe and what isn’t, resisting anything you say that tries to change their views. How should this fact affect your messaging?


Bruce Schneier on How the Media Influences Our Fear of Terrorism


Bruce Schneier is something of an intellectual celebrity on all matters relating to security, especially information security. Although he doesn’t cite the work of Peter Sandman, what he writes about the psychology of security and risk is consistent with what Sandman says.

He posted the following article to his site one week ago. If you’re interested in terrorism-related risk, you’ll want to check this out. It’s a short read.


How Not to Ride the Risk Communication Seesaw: How Clinton Lost Ambivalent Voters to Trump

I work professionally in information risk management. About a decade ago, I first discovered Peter Sandman. He seems to be the Stephen Hawking of risk communication. In July 2016, Sandman diagnosed why the left/liberals have completely failed in their efforts to counter Trump and Brexit. With the benefit of 20/20 hindsight, his column now seems prophetic.

This is something of a long read, but it’s worth the effort no matter where you fall on the political spectrum. If more people followed his advice, it would completely change the national conversation about politics (and for the better).


Risk = Hazard + Outrage, or Why Risk Stats about Toddlers, Lightning, and Guns vs. Terrorists Don’t Work

Philosopher Massimo Pigliucci recently tweeted a comparison of risk stats.


While these statistics are surely correct, they are just as surely unlikely to have much impact on the public perception of risk. Peter Sandman is an expert in the field of risk communication. One of his contributions is the quasi-formula “Risk = Hazard + Outrage.” Pigliucci’s statistics are an example of what Sandman calls data about the “hazard.” But there is another set of considerations, what he calls “outrage,” that plays a much bigger role in how most people think about risk.

Sandman defines “outrage” as “how upsetting the risky situation is.” He explains as follows:

When I invented the label “outrage” some 30 years ago I had in mind the sort of righteous anger people feel when they suspect a nearby factory is belching carcinogens into the air. But as I use the concept now, it applies to fear-arousing situations as much as anger-arousing situations. High-outrage risks are the risks that tend to upset people, independent of how much harm they’re actually likely to do.

A risk that is voluntary, for example, provokes less outrage than one that’s coerced. A fair risk is less outrage-provoking than an unfair one. Among the other outrage factors (the link is to a PDF file; see the sketch after this list):

  • Familiar versus exotic
  • Not memorable versus memorable
  • Not dreaded versus dreaded
  • Individually controlled versus controlled by others
  • Trustworthy sources versus untrustworthy sources
  • Responsive process versus unresponsive process
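
To see how these factors might be applied, here is a toy checklist scorer in Python. This is my own construction, not Sandman’s method: the factor names come from the list above, but the scoring scheme and the example ratings for terrorism versus everyday gun deaths are illustrative assumptions.

```python
# High-outrage side of each factor pair from the list above.
OUTRAGE_FACTORS = [
    "exotic (vs. familiar)",
    "memorable (vs. not memorable)",
    "dreaded (vs. not dreaded)",
    "controlled by others (vs. individually controlled)",
    "untrustworthy sources (vs. trustworthy sources)",
    "unresponsive process (vs. responsive process)",
]

def outrage_score(ratings: dict[str, bool]) -> int:
    """Count how many factors land on the high-outrage side (toy scoring)."""
    return sum(ratings[factor] for factor in OUTRAGE_FACTORS)

# Illustrative ratings only; reasonable people could score these differently.
terrorism_ratings = {f: True for f in OUTRAGE_FACTORS}   # exotic, memorable, dreaded...
gun_death_ratings = {f: False for f in OUTRAGE_FACTORS}  # familiar, chronic, feels self-controlled

print(f"terrorism: {outrage_score(terrorism_ratings)}/6 outrage factors high")   # 6/6
print(f"gun deaths: {outrage_score(gun_death_ratings)}/6 outrage factors high")  # 0/6
```

On this toy scoring, terrorism maxes out the outrage factors while everyday gun deaths barely register, which is exactly the mismatch between felt risk and statistical hazard that Sandman’s quasi-formula predicts.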


Not only does Sandman’s hazard vs. outrage distinction explain why appeals to comparative risk statistics don’t work, it is also constructive, because it helps guide the conversation. Most of Sandman’s website is, in fact, an extended lesson in how to use an awareness of his “outrage factors” to get people’s outrage to better align with the hazard.

I’m not going to spell out how Sandman’s outrage factors play into the debate over Trump’s Executive Order temporarily banning immigration from seven Muslim-majority nations. I’ll leave that as an exercise for the reader. But I predict that, if you apply his techniques, you can both predict the talking points of Trump’s supporters and identify areas where Trump’s critics could do a much better job.