Scope neglect

From HandWiki

Scope neglect or scope insensitivity is a cognitive bias that occurs when the valuation of a problem does not scale multiplicatively with its size. Scope neglect is a specific form of extension neglect.[1]

In one study, respondents were asked how much they were willing to pay to prevent migrating birds from drowning in uncovered oil ponds by covering the ponds with protective nets. Subjects were told that either 2,000, 20,000, or 200,000 migrating birds were affected annually; they reported being willing to pay $80, $78, and $88, respectively.[2] Other studies of willingness to pay to prevent harm have found either a logarithmic relationship or no relationship at all to the size of the problem.[3]
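The contrast between the responses the study found and the responses that scope-sensitive valuation would predict can be made concrete with a short numeric sketch. The figures below use the bird-study numbers cited above; the "linear" benchmark (anchoring $80 at 2,000 birds and scaling proportionally) and the logarithmic curve are illustrative assumptions, not data from the study.

```python
import math

# Problem sizes and reported willingness-to-pay (dollars) from the study.
birds = [2_000, 20_000, 200_000]
reported_wtp = [80, 78, 88]

# Hypothetical scope-sensitive benchmark: $80 for 2,000 birds, scaled
# linearly with the number of birds affected.
linear_wtp = [80 * n / birds[0] for n in birds]

# A logarithmic response (as found in other studies) grows far more slowly.
log_wtp = [80 * math.log10(n) / math.log10(birds[0]) for n in birds]

for n, rep, lin, lg in zip(birds, reported_wtp, linear_wtp, log_wtp):
    print(f"{n:>7} birds: reported ${rep:>3}, linear ${lin:>6.0f}, log ${lg:>4.0f}")
```

A hundredfold increase in the number of birds leaves the reported valuations essentially flat, while proportional valuation would multiply them a hundredfold; even the logarithmic curve would rise by roughly 60%.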

The psychologist Daniel Kahneman explains scope neglect in terms of judgment by prototype, a refinement of the representativeness heuristic. "The story [...] probably evokes for many readers a mental representation of a prototypical incident, perhaps an image of an exhausted bird, its feathers soaked in black oil, unable to escape,"[4] and subjects based their willingness-to-pay mostly on that mental image.

Psychologist Paul Slovic has conducted research on the phenomenon of psychic numbing, which is closely linked to scope neglect. Psychic numbing occurs when individuals cannot properly conceptualize harms affecting a large number of people and give these harms less importance than the same harm occurring to one identifiable person.[5]

Applications of scope neglect

Philosopher Toby Ord argues that scope neglect can explain why the moral importance of existential risks to humanity is frequently underweighted relative to the stakes involved.[6] In his book The Precipice: Existential Risk and the Future of Humanity, Ord refers explicitly to scope neglect and provides the following example for the bias:[7]

[W]e tend to treat nuclear war as an utter disaster, so we fail to distinguish nuclear wars between nations with a handful of nuclear weapons (in which millions would die) from a nuclear confrontation with thousands of nuclear weapons (in which a thousand times as many people would die, and our entire future may be destroyed).

Eliezer Yudkowsky previously made similar remarks regarding the effect of scope neglect on the public perception of existential risks.[8]


References

  1. Kahneman, Daniel (2000). "Evaluation by moments, past and future". In Kahneman, Daniel (ed.). Choices, Values and Frames. p. 708.
  2. Desvousges, William H.; Johnson, Reed; Dunford, Richard; Boyle, Kevin; Hudson, Sarah; Wilson, K. Nicole (1992). "Measuring Non-Use Damages Using Contingent Valuation: An Experimental Evaluation of Accuracy". Research Triangle Institute Monograph 92-1. doi:10.3768/rtipress.2009.bk.0001.1009.
  3. Kahneman, Daniel; Ritov, Ilana; Schkade, Daniel (1999). "Economic Preferences or Attitude Expressions?: An Analysis of Dollar Responses to Public Issues". Journal of Risk and Uncertainty 19: 203–235. doi:10.1007/978-94-017-1406-8_8. ISBN 978-90-481-5776-1. http://www.geog.ucsb.edu/~deutsch/geog111_211a/papers/Kahneman%20on%20attitude%20and%20affection.pdf. 
  4. Kahneman, Ritov & Schkade (1999), p. 212.
  5. Slovic, Paul (2007). ""If I look at the mass I will never act": Psychic numbing and genocide". Judgment and Decision Making 2 (2): 79–95. doi:10.1017/S1930297500000061. 
  6. Purtill, Corinne. "How Close Is Humanity to the Edge?" (in en-us). The New Yorker. https://www.newyorker.com/culture/annals-of-inquiry/how-close-is-humanity-to-the-edge. Retrieved 2020-11-27. 
  7. Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. United Kingdom: Bloomsbury Publishing. pp. 67. ISBN 978-1526600219. 
  8. Yudkowsky, Eliezer (2008). "Cognitive biases potentially affecting judgment of global risks". In Bostrom, Nick; Ćirković, Milan M. (eds.). Global Catastrophic Risks. Oxford University Press. p. 114.