Availability cascade

An availability cascade is a self-reinforcing cycle that explains the development of certain kinds of collective beliefs. A novel idea or insight, usually one that seems to explain a complex process in a simple or straightforward manner, gains rapid currency in popular discourse by its very simplicity and apparent insightfulness. Its rising popularity triggers a chain reaction within the social network: individuals adopt the new insight because other people within the network have adopted it and because, on its face, it seems plausible. This increased use and popularity stem both from the availability of the previously obscure term or idea and from the need of individuals using it to appear current with the stated beliefs and ideas of others, regardless of whether they in fact fully believe what they are expressing. Their need for social acceptance, and the apparent sophistication of the new insight, overwhelm their critical thinking. The idea of the availability cascade was first developed by Timur Kuran and Cass Sunstein as a variation of information cascades mediated by the availability heuristic, with the addition of reputational cascades.[1] The concept has been highly influential in finance theory and regulatory research, particularly with respect to assessing and regulating risk.

Cascade elements

Availability cascades occur in a society via public discourse (e.g. the public sphere and the news media) or over social networks—sets of linked actors in one or more of several roles. These actors process incoming information to form their private beliefs according to various rules, both rational and semi-rational. The semi-rational rules include the heuristics, in particular the availability heuristic. The actors then behave and express their public beliefs according to self-interest, which might cause their publicly expressed beliefs to deviate from their privately held beliefs.

Kuran and Sunstein emphasize the role of availability entrepreneurs, agents willing to invest resources in promoting a belief in order to derive some personal benefit. Other availability entrepreneurs with opposing interests may wage availability counter-campaigns. Other key roles include journalists and politicians, both of whom are subject to economic and reputational pressures, the former competing in the media market, the latter for political status. As resources (e.g. attention and money) are limited, beliefs compete with one another in the "availability market". A given incident and subsequent availability campaign may succeed in raising the availability of one issue at the expense of other issues.[1]
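The interplay of private signals and reputational pressure described above can be illustrated with a toy simulation. This is a hypothetical sketch, not Kuran and Sunstein's formal model: a handful of availability entrepreneurs who adopt a claim unconditionally can tip every subsequent actor into public adoption, even when most private signals correctly indicate that the claim is false.

```python
import random

def simulate_cascade(n_agents=200, n_entrepreneurs=5,
                     signal_accuracy=0.6, reputation_weight=2.0, seed=0):
    """Toy sequential-adoption model. Each agent receives a noisy private
    signal (which usually, and correctly, says the claim is false) but
    also feels a reputational pull toward the fraction of earlier actors
    who publicly adopted the claim. The first `n_entrepreneurs` agents
    adopt unconditionally, playing the availability-entrepreneur role."""
    rng = random.Random(seed)
    adopted = [True] * n_entrepreneurs
    for _ in range(n_agents - n_entrepreneurs):
        # Private signal: -1 ("false") with probability signal_accuracy.
        private = -1.0 if rng.random() < signal_accuracy else 1.0
        # Reputational pull: +1 if all prior actors adopted, -1 if none did.
        social = 2.0 * sum(adopted) / len(adopted) - 1.0 if adopted else 0.0
        adopted.append(private + reputation_weight * social > 0)
    return sum(adopted) / n_agents

# With five entrepreneurs, every later agent's reputational pull (+2.0)
# outweighs even a negative private signal (-1.0), so adoption is total;
# with no entrepreneurs and no reputational pull, adoption stays low.
```

The parameter values are arbitrary; the point is only that publicly expressed beliefs can decouple from privately held ones once the reputational term dominates.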

Belief formation

Dual process theory posits that human reasoning is divided into two systems, often called System 1 and System 2. System 1 is automatic and unconscious; other terms used for it include the implicit system, the experiential system, the associative system, and the heuristic system. System 2 is evolutionarily recent and specific to humans, performing slower, sequential thinking. It is also known as the explicit system, the rule-based system, the rational system, or the analytic system. In The Happiness Hypothesis, Jonathan Haidt refers to System 1 and System 2 as the elephant and the rider: while human beings incorporate reason into their beliefs, whether by using facts and logic directly or by applying them as a test to hypotheses formed by other means, it is the elephant that is really in charge.

Cognitive biases

Heuristics are simple, efficient rules which people often use to form judgments and make decisions. They are mental shortcuts that replace a complex problem with a simpler one. These rules work well under most circumstances, but they can lead to systematic deviations from logic, probability or rational choice theory. The resulting errors are called "cognitive biases" and many different types have been documented. These have been shown to affect people's choices in situations like valuing a house or deciding the outcome of a legal case. Heuristics usually govern automatic, intuitive judgments but can also be used as deliberate mental strategies when working from limited information. While seemingly irrational, the cognitive biases may be interpreted as the result of bounded rationality, with human beings making decisions while economizing time and effort.

Kuran and Sunstein describe the availability heuristic as more fundamental than the other heuristics: besides being important in its own right, it enables and amplifies the others, including framing, representativeness, anchoring, and reference points.[1]

Availability heuristic

Even educated human beings are notoriously poor at thinking statistically.[2] The availability heuristic, first identified by Daniel Kahneman and Amos Tversky, is a mental shortcut that occurs when people judge the probability of events by how easy it is to think of examples. The availability heuristic operates on the notion that, "if you can think of it, it must be important." Availability can be influenced by the emotional power of examples and by their perceived frequency; while personal, first-hand incidents are more available than those that happened to others, availability can be skewed by the media. In his book Thinking, Fast and Slow, Kahneman cites the examples of celebrity divorces and airplane crashes; both are more often reported by the media, and thus tend to be exaggerated in perceived frequency.[3]
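The skew Kahneman describes can be sketched in a few lines. All numbers below are invented for illustration: if recall is driven by media coverage rather than by incidence, then recall-based frequency estimates inherit the media's distortion.

```python
# Hypothetical numbers: shares of actual incidents vs. shares of news reports.
true_share = {"plane crash": 0.01, "car crash": 0.99}
coverage_share = {"plane crash": 0.70, "car crash": 0.30}

def availability_estimate(event):
    """Availability-style judgment: perceived frequency is proportional
    to how often examples are encountered (i.e. covered and thus easily
    recalled), not to how often the event actually occurs."""
    return coverage_share[event] / sum(coverage_share.values())

# The perceived share of plane crashes (~0.70) lands far above the
# true share (0.01), mirroring the exaggeration Kahneman describes.
```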

Examples

An important class of judgments concerns risk: the expectation of harm from a given threat, a function of the threat's likelihood and impact. Changes in perceived risk produce risk compensation—correspondingly more or less mitigation, including precautionary measures and support for regulation. Kuran and Sunstein offer three examples of availability cascades—Love Canal, the Alar scare, and TWA Flight 800—in which a spreading public panic led to growing calls for increasingly expensive government action to deal with risks that later turned out to be grossly exaggerated.[1] Others have used the term "culture of fear" to refer to the habitual use of such fear appeals to achieve goals, notably in the case of the threat of terrorism.
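Risk as a function of likelihood and impact is often operationalized as expected harm, the product of the two. A minimal sketch with invented figures shows how a vivid but rare threat can carry far less expected harm than a mundane, common one:

```python
def expected_harm(annual_probability, impact):
    """Risk operationalized as expected harm: likelihood times impact."""
    return annual_probability * impact

# Invented figures for illustration only.
vivid = expected_harm(1e-7, 1.0)    # rare threat with heavy media coverage
mundane = expected_harm(1e-4, 1.0)  # common threat with little coverage

# The mundane threat carries 1000x the expected harm, yet an availability
# cascade can make the vivid one dominate perceived risk.
```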

Disease threats

In the early years of the HIV/AIDS epidemic, many believed that the disease received less attention than warranted, in part due to the stigma attached to its sufferers. Since then advocates—availability entrepreneurs including LGBT activists and the conservative Surgeon General of the United States, C. Everett Koop—have succeeded in raising awareness enough to achieve significant funding. Similarly, awareness and funding for breast cancer and prostate cancer are high, thanks in part to the availability of these diseases. Other prevalent diseases competing for funding but lacking the availability of HIV/AIDS or cancer include lupus, sickle-cell anemia, and tuberculosis.[4]

Vaccination scares

The MMR vaccine controversy was an example of an unwarranted health scare. It was triggered by the 1998 publication of a paper in the medical journal The Lancet which presented apparent evidence that autism spectrum disorders could be caused by the MMR vaccine, an immunization against measles, mumps and rubella.[5] In 2004, investigations by Sunday Times journalist Brian Deer revealed that the lead author of the article, Andrew Wakefield, had multiple undeclared conflicts of interest,[6] had manipulated evidence,[7] and had broken other ethical codes. The Lancet paper was partially retracted in 2004 and fully retracted in 2010, and Wakefield was found guilty of professional misconduct. The scientific consensus is that no evidence links the vaccine to the development of autism and that the vaccine's benefits greatly outweigh its risks. The claims in Wakefield's 1998 Lancet article were widely reported;[8] vaccination rates in the UK and Ireland dropped sharply,[9] followed by a significantly increased incidence of measles and mumps, resulting in deaths and severe and permanent injuries.[10] Reaction to vaccine controversies has contributed to a significant increase in preventable diseases, including measles[11] and pertussis (whooping cough), which in 2011 experienced its worst outbreak in 70 years as a result of reduced vaccination rates.[12] Concerns about immunization safety often follow a pattern: some investigators suggest that a medical condition is an adverse effect of vaccination; a premature announcement of the alleged adverse effect is made; the initial study is not reproduced by other groups; and finally, it takes several years to regain public confidence in the vaccine.[13]

Global warming

Extreme weather events provide opportunities to raise the availability of global warming. In the United States, the mass media devoted little coverage to global warming until the drought of 1988 and the testimony of James E. Hansen to the United States Senate, which explicitly attributed "the abnormally hot weather plaguing our nation" to global warming.[14] The global warming controversy has attracted availability entrepreneurs on both sides: for example, the book Merchants of Doubt argues that a scientific consensus was reached long ago and that doubt has since been deliberately manufactured, while climatologist Patrick Michaels provides the denialist viewpoint.

Gun violence

The media inclination toward sensationalism results in a tendency to devote disproportionate coverage to sympathetic victims (e.g. missing white woman syndrome), terrifying assailants (e.g. media coverage of the Virginia Tech massacre), and incidents with multiple victims. Although half the victims of gun violence in the United States are black, generally young urban black males,[15] media coverage and public awareness spike after suburban school shootings, as do calls for stricter gun control laws.

International adoption scandals

International adoption scandals receive disproportionate attention in the countries of adoptees' origins. As the incidents involve abuse of children, they easily spark media attention, and availability entrepreneurs (e.g. populist politicians) fan the flames of xenophobia, without making statistical comparisons of adoptee abuse in the source and target nations, or of the likelihood of abuse vs. other risks.[16]

Poisoned candy myths

Poisoned candy myths are urban legends that malevolent individuals could hide poison or drugs, or sharp objects such as razor blades, needles, or broken glass in candy and distribute the candy in order to harm random children, especially during Halloween trick-or-treating. Several events fostered the candy tampering myth. The first took place in 1964, when an annoyed Long Island, New York housewife started giving out packages of inedible objects to children who she believed were too old to be trick-or-treating. The packages contained items such as steel wool, dog biscuits, and ant buttons (which were clearly labeled with the word "poison"). Although nobody was injured, she was prosecuted and pleaded guilty to endangering children. The same year saw reports of lye-filled bubble gum being handed out in Detroit and rat poison being given in Philadelphia.[17]

The second milestone in the spread of the candy-tampering myths was an article published in The New York Times in 1970. It claimed that "Those Halloween goodies that children collect this weekend on their rounds of ‘trick or treating’ may bring them more horror than happiness", and provided specific examples of potential tampering.[18]

In 2008, candy was found with metal shavings and metal blades embedded in it. The candy was Pokémon Valentine's Day lollipops purchased from a Dollar General store in Polk County, Florida. The candy was determined to have been manufactured in China and not tampered with within the United States. The lollipops were pulled from the shelves after a mother reported a blade in her child's lollipop and after several more lollipops with metal shavings in them were confiscated from a local elementary school.[19] Also in 2008, some cold medicine was discovered in cases of Smarties that were handed out to children in Ontario.[20]

Over the years, various experts have tried to debunk the candy-tampering stories. Among them is Joel Best, a University of Delaware sociologist who specializes in investigating candy-tampering legends. In his studies, and in the book Threatened Children: Rhetoric and Concern about Child-Victims, he searched newspapers from 1958 onward for reports of candy tampering and found fewer than 90 instances that might have qualified as actual tampering.[21] Best also found five child deaths initially attributed by local authorities to homicidal strangers, but none of those attributions was sustained by investigation.[22]

Despite the falsity of these claims, the news media promoted the story continuously throughout the 1980s, with local news stations featuring frequent coverage. During this time, cases of poisoning were repeatedly reported on the basis of unsubstantiated claims, or before a full investigation could be completed, and were often never followed up. This one-sided coverage contributed to the overall panic and prompted rival media outlets to issue reports of candy tampering as well. By 1985, the media had driven the hysteria about candy poisonings to such a point that an ABC News/Washington Post poll found that 60% of parents feared their children would be injured or killed by Halloween candy sabotage.

Media feeding frenzy

The phenomenon of media feeding frenzies is driven by a combination of the psychology described by the availability cascade model and the financial imperatives of media organizations to retain their funding.

Policy implications

Technocracy vs. democracy

There are two schools of thought on how to cope with risks raised by availability cascades: technocratic and democratic. The technocratic approach, championed by Kuran and Sunstein, emphasizes assessing, prioritizing, and mitigating risks according to objective risk measures (e.g. expected costs, expected disability-adjusted life years (DALY)). The technocratic approach considers availability cascades to be phenomena of mass irrationality that can distort or hijack public policy, misallocating resources or imposing regulatory burdens whose costs exceed the expected costs of the risks they mitigate.

The democratic approach, championed by Paul Slovic, respects risk preferences as revealed by the availability market. For example, though lightning strikes kill far more people each year than shark attacks, if people genuinely consider death by shark worse than death by lightning, a disproportionate share of resources should be devoted to averting shark attacks.
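The disagreement between the two schools can be made concrete with a small sketch. The death counts below are rough U.S. annual figures and the "dread" weight is invented: the technocratic rule ranks threats by objective expected harm alone, while the democratic rule weights each harm by the public's revealed preferences.

```python
# Rough annual U.S. death counts; the dread weights are invented.
annual_deaths = {"lightning": 25, "shark attack": 1}
dread_weight = {"lightning": 1.0, "shark attack": 40.0}

# Technocratic ranking: objective expected harm only.
technocratic = sorted(annual_deaths, key=annual_deaths.get, reverse=True)

# Democratic ranking: each harm weighted by how much the public dreads it.
democratic = sorted(annual_deaths,
                    key=lambda t: annual_deaths[t] * dread_weight[t],
                    reverse=True)

# technocratic ranks lightning first (25 > 1); democratic ranks shark
# attacks first (1 * 40.0 > 25 * 1.0).
```

Whether the dread weight should enter the calculation at all is precisely what the two schools dispute.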

Institutional safeguards

Kuran and Sunstein recommend that availability cascades be recognized, and institutional safeguards be implemented in all branches of government. They recommend expanded product defamation laws, analogous to personal libel laws, to discourage availability entrepreneurs from knowingly spreading false and damaging reports about a product. They recommend that the legislative branch create a Risk Regulation Committee to assess risks in a broader context and perform cost-benefit analyses of risks and regulations, avoiding hasty responses pandering to public opinion. They recommend that the executive branch use peer review to open agency proposals to scrutiny by informed outsiders. They also recommend the creation of a Risk Information Center with a Risk Information Web Site to provide the public with objective risk measures.[1] In the United States, the Centers for Disease Control and Prevention[23] and the Federal Bureau of Investigation[24] maintain web sites that provide objective statistics on the causes of death and violent crime.

References

  1. Kuran, Timur, and Sunstein, Cass, Availability Cascades and Risk Regulation, Stanford Law Review, Vol. 51, No. 4 (1999).
  2. Lehrer, Jonah (2012-06-12). "Why Smart People Are Stupid". http://www.newyorker.com/online/blogs/frontal-cortex/2012/06/daniel-kahneman-bias-studies.html. 
  3. Daniel Kahneman (2011-10-25). Thinking, Fast and Slow. Macmillan. ISBN 978-1-4299-6935-2. https://books.google.com/books?id=ZuKTvERuPG8C. Retrieved 2013-02-12. 
  4. Brower, Vicki (2005). "The squeaky wheel gets the grease". EMBO Reports 6 (11): 1014–1017. doi:10.1038/sj.embor.7400564. PMID 16264425. 
  5. "Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children". Lancet 351 (9103): 637–41. 1998. doi:10.1016/S0140-6736(97)11096-0. PMID 9500320. http://briandeer.com/mmr/lancet-paper.htm. Retrieved 2007-09-05.  (Retracted, see doi:10.1016/S0140-6736(10)60175-4, PMID 20137807)
  6. The Sunday Times 2004.
  7. Deer B (8 February 2009). "MMR doctor Andrew Wakefield fixed data on autism". The Sunday Times (London). http://www.timesonline.co.uk/tol/life_and_style/health/article5683671.ece. Retrieved 2009-02-09. 
  8. Goldacre B (30 August 2008). "The MMR hoax". The Guardian (London). Archived from the original on 6 February 2015. https://web.archive.org/web/20150206073230/http://www.theguardian.com/society/2008/aug/30/mmr.health.media. Retrieved 2008-08-30. 
  9. "Improving uptake of MMR vaccine". BMJ 336 (7647): 729–30. 2008. doi:10.1136/bmj.39503.508484.80. PMID 18309963. 
  10. Pepys MB (2007). "Science and serendipity". Clin Med 7 (6): 562–78. doi:10.7861/clinmedicine.7-6-562. PMID 18193704. 
  11. Boston Children's Hospital
  12. Heffter, Emily (June 2, 2011). "State leads nation in kids who aren't getting vaccines". Seattle Times. Archived from the original on January 16, 2013. https://web.archive.org/web/20130116163620/http://seattletimes.com/html/localnews/2015215221_vaccines03m.html. Retrieved September 28, 2012. 
  13. "Adverse events following immunization: perception and evidence". Current Opinion in Infectious Diseases 20 (3): 237–46. 2007. doi:10.1097/QCO.0b013e32811ebfb0. PMID 17471032. 
  14. McCright, A.M.; Dunlap R.E. (2000). "Challenging global warming as a social problem: An analysis of the conservative movement's counter-claims". Social Problems 47 (4): 499–522. doi:10.1525/sp.2000.47.4.03x0305s. http://www.climateaccess.org/sites/default/files/McCright_Challenging%20Global%20Warming.pdf.  See p. 500.
  15. "Expanded Homicide Data". Uniform Crime Reports. 2010. https://www.fbi.gov/about-us/cjis/ucr/crime-in-the-u.s/2010/crime-in-the-u.s.-2010/offenses-known-to-law-enforcement/expanded/expandhomicidemain. 
  16. Montgomery, Mark; Powell, Irene (2018-03-01). "International adoptions have dropped 72 percent since 2005 – here's why". The Conversation. https://theconversation.com/international-adoptions-have-dropped-72-percent-since-2005-heres-why-91809. 
  17. "Deadly 'Tricks' Given Children in 3 States". The Milwaukee Journal. United Press International: p. A18. November 2, 1964. 
  18. Klemesrud, Judy (October 28, 1970). "Those Treats May Be Tricks". The New York Times: p. 56. 
  19. "Metal-Filled Lollipops Seized By Deputies At Elementary School - Orlando News Story - WKMG Orlando". Local6.com. February 14, 2008. Archived from the original on April 20, 2008. https://web.archive.org/web/20080420115645/http://www.local6.com/news/15304726/detail.html. Retrieved July 16, 2009. 
  20. "Cold medication discovered in Halloween candy". CBC. November 7, 2008. http://www.cbc.ca/health/story/2008/11/07/smarties-tainted.html. Retrieved November 8, 2008. 
  21. Best, Joel (1993). Threatened children : rhetoric and concern about child-victims. Chicago: University of Chicago Press. ISBN 0226044262. http://www.worldcat.org/search?qt=wikipedia&q=isbn%3A0226044262. 
  22. Best, Joel; Gerald T. Horiuchi (1985). "The Razor Blade in the Apple: The Social Construction of Urban Legends". Social Problems 32 (5): 488–99. doi:10.2307/800777. 
  23. "CDC Web-based Injury Statistics Query and Reporting System". 9 February 2023. https://www.cdc.gov/injury/wisqars/. 
  24. "FBI Uniform Crime Reports". https://www.fbi.gov/about-us/cjis/ucr/ucr. 
