"Pink Slime" and Cognitive Bias; Or, How to Make Mountains of Molehills

Last year, Nobel Prize-winning psychologist Daniel Kahneman published a book titled Thinking, Fast and Slow. The book has won awards and landed on bestseller lists, and I recommend it. (It's not an easy read, but it's worth the effort.) Kahneman examines how we "think": how our intuition and cognitive biases (and we all have them) shape the way we process and respond to ideas and events. In Chapter 13, he discusses the role of "risk" in our thinking, especially the way "biased reactions" and inaccurate assessments of risk can and do shape public policy. He cites the work of another scholar, Cass Sunstein, who argues that, as Kahneman puts it,

biased reactions to risks are an important source of erratic and misplaced priorities in public policy. Lawmakers and regulators may be overly responsive to the irrational concerns of citizens, both because of political sensitivity and because they are prone to the same cognitive biases as other citizens.

Kahneman describes what happens when the public's inability to calculate "risk" accurately fuels an irrational response to a relatively trivial matter. The passage is long but worth quoting in full.

[This] . . . self-sustaining chain of events . . . may start from media reports of a relatively minor event and lead up to public panic and large-scale government action. On some occasions, a media story about a risk catches the attention of a segment of the public, which becomes aroused and worried. This emotional reaction becomes a story in itself, prompting additional coverage in the media, which in turn produces greater concern and involvement. The cycle is sometimes sped along deliberately by "availability entrepreneurs," individuals or organizations who work to ensure a continuous flow of worrying news.

The danger is increasingly exaggerated as the media compete for attention-grabbing headlines. Scientists and others who try to dampen the increasing fear and revulsion attract little attention, most of it hostile: anyone who claims that the danger is overstated is suspected of association with a "heinous cover-up." The issue becomes politically important because it is on everyone's mind, and the response of the political system is guided by the intensity of public sentiment.

Kahneman cites two examples of minor episodes that blew up into Very Big Deals: the Love Canal episode of the late 1970s and the "Alar scare" of 1989. In both cases, the facts did not support the dangers touted.

To which I would add the example of Pink Slime in 2012: Jamie Oliver describes salvaged beef scraps as dog food; a blogger demands that the USDA stop using said scraps and launches a petition; the media picks up the story; a media-savvy group of "availability entrepreneurs" fans the flames; and a company ends up teetering on the verge of bankruptcy.

________________

Quotes are from Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2012), p. 142.