Normalcy bias

Short description: Disbelief or minimization in response to threat warnings

Normalcy bias, or normality bias, is a cognitive bias that leads people to disbelieve or minimize threat warnings.[1] Consequently, individuals underestimate the likelihood of a disaster, the chance that it will affect them, and its potential adverse effects.[2] The normalcy bias causes many people to prepare inadequately for natural disasters, market crashes, and calamities caused by human error. About 80% of people reportedly display normalcy bias during a disaster.[3]

The normalcy bias can manifest in response to warnings about disasters and actual catastrophes. Such events can range in scale from incidents such as traffic collisions to global catastrophic risks. The event may involve socially constructed phenomena, such as loss of money in a market crash, or direct threats to the continuity of life, as in a natural disaster like a tsunami or violence in war.

Normalcy bias has also been called analysis paralysis, the ostrich effect,[4] and, by first responders, negative panic.[5] The opposite of normalcy bias is overreaction, or worst-case scenario bias,[6][7] in which small deviations from normality are treated as signals of an impending catastrophe.

Phases

Amanda Ripley, author of The Unthinkable: Who Survives When Disaster Strikes – and Why, identifies common response patterns of people in disasters and explains that there are three phases of response: "denial, deliberation, and the decisive moment". With regard to the first phase, described as "denial", Ripley found that people were likely to deny that a disaster was happening. It takes time for the brain to process information and recognize that a disaster is a threat. In the "deliberation" phase, people have to decide what to do. If a person does not have a plan in place, this poses a serious problem because the effects of life-threatening stress on the body (e.g. tunnel vision, auditory exclusion, time dilation, out-of-body experiences, or reduced motor skills) limit an individual's ability to perceive information and make plans. Ripley asserts that in the third and final phase, described as the "decisive moment", a person must act quickly and decisively. Failure to do so can result in injury or death. She explains that the faster someone can get through the denial and deliberation phases, the quicker they will reach the decisive moment and begin to take action.[8]

Examples

Normalcy bias can occur during car crashes.

Journalist David McRaney wrote that "Normalcy bias flows into the brain no matter the scale of the problem. It will appear whether you have days and plenty of warning or are blindsided with only seconds between life and death."[9] The bias can manifest in phenomena such as car crashes: crashes occur very frequently in aggregate, but the average individual experiences one only rarely, if ever, and so tends to assume it will not happen. It also manifests itself in connection with events in world history. According to a 2001 study by sociologist Thomas Drabek, when people are asked to leave in anticipation of a disaster, most check with four or more sources of information before deciding what to do. This process of checking with others before acting, known as milling, is common in disasters.[10]

1827 illustration of the eruption of Vesuvius in 79 CE

As for events in world history, the normalcy bias can explain why, when the volcano Vesuvius erupted in 79 CE, the residents of Pompeii watched for hours without evacuating.[11] It can explain why thousands of people refused to leave New Orleans as Hurricane Katrina approached[12] and why at least 70% of 9/11 survivors spoke with others before evacuating.[10] Officials at the White Star Line made insufficient preparations to evacuate passengers on the Titanic, and passengers refused evacuation orders, possibly because they underestimated the odds of a worst-case scenario and minimized its potential impact.[13] Similarly, experts connected with the Fukushima nuclear power plant were strongly convinced that a multiple-reactor meltdown could never occur.[14]

A website for police officers has noted that members of that profession have "all seen videos of officers who were injured or killed while dealing with an ambiguous situation, like the old one of a father with his young daughter on a traffic stop". In the video referred to, "the officer misses multiple threat cues...because the assailant talks lovingly about his daughter and jokes about how packed his minivan is. The officer only seems to react to the positive interactions, while seeming to ignore the negative signals. It's almost as if the officer is thinking, 'Well I've never been brutally assaulted before so it certainly won't happen now.' No one is surprised at the end of the video when the officer is violently attacked, unable to put up an effective defense." This professional failure, notes the website, is a consequence of normalcy bias.[15]

Normalcy bias, David McRaney has written, "is often factored into fatality predictions in everything from ship sinkings to stadium evacuations". Disaster movies, he adds, "get it all wrong. When you and others are warned of danger, you don't evacuate immediately while screaming and flailing your arms." McRaney notes that in the book Big Weather, tornado chaser Mark Svenvold discusses "how contagious normalcy bias can be. He recalled how people often tried to convince him to chill out while fleeing from impending doom. Even when tornado warnings were issued, people assumed it was someone else's problem. Stake-holding peers, he said, would try to shame him into denial so they could remain calm. They didn't want him deflating their attempts at feeling normal".[9]

Hypothesized cause

The normalcy bias may be caused in part by the way the brain processes new data. Research suggests that even when the brain is calm, it takes 8–10 seconds to process new information. Stress slows that processing, and when the brain cannot find an acceptable response to a situation, it fixates on a single, sometimes default, solution that may or may not be correct. An evolutionary reason for this response could be that paralysis gives an animal a better chance of surviving an attack, since predators are less likely to see prey that is not moving.[10]

Effects

About 80% of people reportedly display normalcy bias in disasters.[3] Normalcy bias has been described as "one of the most dangerous biases we have". The lack of preparation for disasters often leads to inadequate shelter, supplies, and evacuation plans. Even when all these things are in place, individuals with a normalcy bias often refuse to leave their homes.[16][17]

Normalcy bias can cause people to drastically underestimate the effects of a disaster. As a result, they believe that they will be safe even though information from the radio, television, or neighbors gives them reason to believe there is a risk. The normalcy bias creates a cognitive dissonance that people then must work to eliminate. Some manage to eliminate it by refusing to believe new warnings coming in and refusing to evacuate (maintaining the normalcy bias), while others eliminate the dissonance by escaping the danger. The possibility that some people may refuse to evacuate causes significant problems in disaster planning.[18]

Prevention

The negative effects of normalcy bias can be combated through the four stages of disaster response:[19]

  • preparation, including publicly acknowledging the possibility of disaster and forming contingency plans.
  • warning, including issuing clear, unambiguous, and frequent warnings and helping the public to understand and believe them.
  • impact, the stage at which the contingency plans take effect and emergency services, rescue teams, and disaster relief teams work in tandem.
  • aftermath, reestablishing equilibrium after the fact, by providing both supplies and aid to those in need.

References

  1. Drabek, Thomas E. (1986). Human System Responses to Disaster: An Inventory of Sociological Findings. New York: Springer-Verlag. p. 72. ISBN 978-1-4612-4960-3. OCLC 852789578. "The initial response to a disaster warning is disbelief." 
  2. Omer, Haim; Alon, Nahman (April 1994). "The continuity principle: A unified approach to disaster and trauma" (in en). American Journal of Community Psychology 22 (2): 275–276. doi:10.1007/BF02506866. PMID 7977181. "... normalcy bias consists in underestimating the probability of disaster, or the disruption involved in it ...". 
  3. Inglis-Arkell, Esther (May 2, 2013). "The frozen calm of normalcy bias". Gizmodo. http://io9.gizmodo.com/the-frozen-calm-of-normalcy-bias-486764924. 
  4. Ince, Wyne (October 23, 2017). Thoughts of Life and Time. Wyne Ince. p. 122. ISBN 978-1-973727-15-6. https://books.google.com/books?id=3AY7DwAAQBAJ&q=%22Normalcy+bias%22+%22ostrich+effect%22&pg=PA122. Retrieved 20 December 2017. 
  5. McRaney, David (2012). You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You're Deluding Yourself. Gotham Books. p. 54. ISBN 978-1-59240-736-1. https://books.google.com/books?id=Dj_ZCwAAQBAJ&q=%22You+Are+Not+So+Smart%22+David+McRaney+%22Normalcy+bias+flows+into+the+brain+no+matter+the+scale+of+the+problem,%22&pg=PA55. Retrieved 20 December 2017. 
  6. Schneier, Bruce. "Worst-case thinking makes us nuts, not safe", CNN, May 12, 2010 (retrieved April 18, 2014); reprinted in Schneier on Security, May 13, 2010 (retrieved April 18, 2014)
  7. Evans, Dylan. "Nightmare Scenario: The Fallacy of Worst-Case Thinking", Risk Management, April 2, 2012 (retrieved April 18, 2014); from Risk Intelligence: How To Live With Uncertainty, by Dylan Evans, Free Press/Simon & Schuster, Inc., 2012; ISBN 978-1-4516-1090-1
  8. Ripley, Amanda (June 10, 2008). The Unthinkable: Who Survives When Disaster Strikes – and Why. Potter/Ten Speed Press/Harmony. ISBN 978-0-307-44927-6. 
  9. McRaney, David (2012). You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You're Deluding Yourself. Gotham Books. p. 55. ISBN 978-1-59240-736-1. https://books.google.com/books?id=Dj_ZCwAAQBAJ. 
  10. Ripley, Amanda (25 April 2005). "How to Get Out Alive". Time (TIME Magazine) 165 (18): 58–62. PMID 16128022. http://content.time.com/time/magazine/article/0,9171,1053663-1,00.html. Retrieved 11 November 2013. 
  11. Vaz, Estelita; Joanaz de Melo, Cristina; Costa Pinto, Lígia (2017). Environmental History in the Making. Springer Publishing. ISBN 978-3-319-41085-2. https://books.google.com/books?id=bXVCDQAAQBAJ. 
  12. Strandberg, Todd. "The Normalcy Bias and Bible Prophecy". Prophezine. http://www.prophezine.com/index.php?option=com_content&id=134:the-normalcy-bias-and-bible-prophecy. 
  13. Hoffman, Bryce (May 16, 2017). Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything. Crown Publishing. p. 80. ISBN 978-1-101-90597-5. https://books.google.com/books?id=176mDAAAQBAJ. 
  14. Saito, William (April 20, 2017). "What Fukushima Disaster Taught Me About Risk Management In Cybersecurity". Forbes. https://www.forbes.com/sites/williamsaito/2017/04/20/what-fukushima-disaster-taught-me-about-risk-management-in-cybersecurity/#2f8a9537681e. 
  15. Smith, Dave (20 August 2015). "Normalcy Bias". Police: The Law Enforcement Magazine. https://www.policemag.com/patrol/article/15346884/normalcy-bias. 
  16. "Beware Your Dangerous Normalcy Bias". Gerold Blog. 2013-04-27. https://geroldblog.com/2013/04/26/beware-your-dangerous-normalcy-bias/. 
  17. "Disaster Prep for the Rest of Us: normalcy bias" (in en). https://theworldlink.com/bandon/opinion/editorial/disaster-prep-for-the-rest-of-us-normalcy-bias/article_71ca34d0-4527-5da8-8526-e6cd544c473d.html. 
  18. Oda, Katsuya. "Information Technology for Advancement of Evacuation". National Institute for Land and Infrastructure Management. http://www.ysk.nilim.go.jp/kakubu/engan/engan/taigai/hapyoronbun/07-17.pdf. 
  19. Valentine, Pamela V.; Smith, Thomas Edward (2002). "Finding Something to Do: The Disaster Continuity Care Model". Brief Treatment and Crisis Intervention 2 (2): 183–196. doi:10.1093/brief-treatment/2.2.183. https://www.researchgate.net/publication/31145336.