Heuristic

From HandWiki
Short description: Problem-solving method that is sufficient for immediate solutions or approximations


A heuristic (/hjʊˈrɪstɪk/; from grc εὑρίσκω (heurískō) 'to find, discover'), or heuristic technique, is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.[1][2]

Examples of methods that employ heuristics include trial and error, rules of thumb and educated guesses.

Heuristics are strategies derived from previous experiences with similar problems. These strategies depend on using readily accessible, though loosely applicable, information to control problem solving in human beings, machines and abstract issues.[3][4] When an individual applies a heuristic in practice, it generally performs as expected. However, it can also produce systematic errors.[5]

The most fundamental heuristic is trial and error, which can be used in everything from matching nuts and bolts to finding the values of variables in algebra problems. In mathematics, some common heuristics involve the use of visual representations, additional assumptions, forward/backward reasoning and simplification. Here are a few commonly used heuristics from George Pólya's 1945 book, How to Solve It:[6]

  • If you are having difficulty understanding a problem, try drawing a picture.
  • If you can't find a solution, try assuming that you have a solution and seeing what you can derive from that ("working backward").
  • If the problem is abstract, try examining a concrete example.
  • Try solving a more general problem first (the "inventor's paradox": the more ambitious plan may have more chances of success).
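The most fundamental heuristic named above, trial and error, can be sketched as a simple goal-test search; the equation and the search range here are illustrative assumptions, not part of Pólya's text:

```python
# A minimal sketch of the trial-and-error heuristic: try candidate
# values in turn until one satisfies the goal test. The first hit is
# "good enough" -- there is no guarantee it is the only or best answer.

def trial_and_error(candidates, is_solution):
    """Return the first candidate that passes the goal test, else None."""
    for candidate in candidates:
        if is_solution(candidate):
            return candidate
    return None

# Find an integer x with x**2 + x == 12 by brute trial over a range.
# Note the search stops at the first solution it happens to meet (-4),
# even though x = 3 also works.
solution = trial_and_error(range(-10, 11), lambda x: x**2 + x == 12)
```

That the search returns -4 rather than 3 illustrates the point of the lead paragraph: the method is sufficient for an immediate answer, not guaranteed to be the one intended.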

In psychology, heuristics are simple, efficient rules, either learned or inculcated by evolutionary processes. These psychological heuristics have been proposed to explain how people make decisions, come to judgements, and solve problems. These rules typically come into play when people face complex problems or incomplete information. Researchers employ various methods to test whether people use these rules. The rules have been shown to work well under most circumstances, but in certain cases can lead to systematic errors or cognitive biases.[7]

History

The study of heuristics in human decision-making was developed in the 1970s and 1980s by the psychologists Amos Tversky and Daniel Kahneman,[8] although the concept had originally been introduced by the Nobel laureate Herbert A. Simon. Simon's primary object of research was problem solving; he showed that we operate within what he called bounded rationality. He coined the term satisficing, which denotes a situation in which people seek solutions, or accept choices or judgements, that are "good enough" for their purposes, although they could be optimised.[9]

Rudolf Groner analysed the history of heuristics from its roots in ancient Greece up to contemporary work in cognitive psychology and artificial intelligence,[10] proposing a cognitive style "heuristic versus algorithmic thinking", which can be assessed by means of a validated questionnaire.[11]

Adaptive toolbox

Gerd Gigerenzer and his research group argued that models of heuristics need to be formal to allow for predictions of behavior that can be tested.[12] They study the fast and frugal heuristics in the "adaptive toolbox" of individuals or institutions, and the ecological rationality of these heuristics; that is, the conditions under which a given heuristic is likely to be successful.[13] The descriptive study of the "adaptive toolbox" proceeds by observation and experiment, while the prescriptive study of ecological rationality requires mathematical analysis and computer simulation. Heuristics – such as the recognition heuristic, the take-the-best heuristic and fast-and-frugal trees – have been shown to be effective in predictions, particularly in situations of uncertainty. It is often said that heuristics trade accuracy for effort, but this is only the case in situations of risk. Risk refers to situations where all possible actions, their outcomes and probabilities are known. In the absence of this information, that is, under uncertainty, heuristics can achieve higher accuracy with lower effort.[14] This finding, known as a less-is-more effect, would not have been found without formal models. The valuable insight of this program is that heuristics are effective not despite their simplicity but because of it. Furthermore, Gigerenzer and Wolfgang Gaissmaier found that both individuals and organisations rely on heuristics in an adaptive way.[15]

Cognitive-experiential self-theory

Heuristics, through greater refinement and research, have begun to be applied to other theories, or to be explained by them. For example, the cognitive-experiential self-theory (CEST) is also an adaptive view of heuristic processing. CEST distinguishes two systems that process information. At some times, roughly speaking, individuals consider issues rationally, systematically, logically, deliberately, effortfully, and verbally. On other occasions, individuals consider issues intuitively, effortlessly, globally, and emotionally.[16] From this perspective, heuristics are part of a larger experiential processing system that is often adaptive but vulnerable to error in situations that require logical analysis.[17]

Attribute substitution

In 2002, Daniel Kahneman and Shane Frederick proposed that cognitive heuristics work by a process called attribute substitution, which happens without conscious awareness.[18] According to this theory, when somebody makes a judgement (of a "target attribute") that is computationally complex, a more easily calculated "heuristic attribute" is substituted. In effect, a cognitively difficult problem is dealt with by answering a rather simpler problem, without being aware of this happening.[18] This theory explains cases where judgements fail to show regression toward the mean.[19] Heuristics can be considered to reduce the complexity of clinical judgments in health care.[20]


Philosophy

A heuristic device is used when an entity X exists to enable understanding of, or knowledge concerning, some other entity Y.

A good example is a model that, as it is never identical with what it models, is a heuristic device to enable understanding of what it models. Stories, metaphors, etc., can also be termed heuristic in this sense. A classic example is the notion of utopia as described in Plato's best-known work, The Republic. The "ideal city" depicted in The Republic is not presented as a goal to be pursued, or as an orientation-point for development. Rather, it shows how things would have to be connected, and how one thing would lead to another (often with highly problematic results), if one opted for certain principles and carried them through rigorously.

Heuristic is also often used as a noun to describe a rule of thumb, procedure, or method.[21] Philosophers of science have emphasised the importance of heuristics in creative thought and the construction of scientific theories.[22] Seminal works include Karl Popper's The Logic of Scientific Discovery and others by Imre Lakatos,[23] Lindley Darden, and William C. Wimsatt.

Law

In legal theory, especially in the theory of law and economics, heuristics are used in the law when case-by-case analysis would be impractical, insofar as "practicality" is defined by the interests of a governing body.[24]

The present securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects. For instance, in all states in the United States the legal drinking age for unsupervised persons is 21 years, because it is argued that people need to be mature enough to make decisions involving the risks of alcohol consumption. However, assuming people mature at different rates, the specific age of 21 would be too late for some and too early for others. In this case, the somewhat arbitrary delineation is used because it is impossible or impractical to tell whether an individual is sufficiently mature for society to trust them with that kind of responsibility. Some proposed changes, however, have included the completion of an alcohol education course rather than the attainment of 21 years of age as the criterion for legal alcohol possession. This would put youth alcohol policy more on a case-by-case basis and less on a heuristic one, since the completion of such a course would presumably be voluntary and not uniform across the population.

The same reasoning applies to patent law. Patents are justified on the grounds that inventors must be protected so they have incentive to invent. It is therefore argued that it is in society's best interest that inventors receive a temporary government-granted monopoly on their idea, so that they can recoup investment costs and make economic profit for a limited period. In the United States, the length of this temporary monopoly is 20 years from the date the patent application was filed, though the monopoly does not actually begin until the application has matured into a patent. However, like the drinking age problem above, the specific length of time would need to be different for every product to be efficient. A 20-year term is used because it is difficult to tell what the number should be for any individual patent. More recently, some, including University of North Dakota law professor Eric E. Johnson, have argued that patents in different kinds of industries – such as software patents – should be protected for different lengths of time.[25]

Stereotyping

Stereotyping is a type of heuristic that people use to form opinions or make judgements about things they have never seen or experienced.[26] Stereotypes work as a mental shortcut to assess everything from the social status of a person (based on their actions)[2] to classifying a plant as a tree based on its being tall, having a trunk, and having leaves (even though the person making the evaluation might never have seen that particular type of tree before).

Stereotypes, as first described by journalist Walter Lippmann in his book Public Opinion (1922), are the pictures we have in our heads that are built around experiences as well as what we are told about the world.[27][28]

Artificial intelligence

A heuristic can be used in artificial intelligence systems while searching a solution space. The heuristic is derived from a function supplied by the system's designer, or by adjusting the weight of branches based on how likely each branch is to lead to a goal node.
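One common form of such heuristic search is greedy best-first search, which always expands the node whose heuristic estimate is smallest. The sketch below uses a grid world and the Manhattan distance as the designer-supplied heuristic function; both are illustrative assumptions:

```python
# Greedy best-first search: expand the frontier node with the lowest
# heuristic estimate h(n). Fast, but not guaranteed to find the
# shortest path -- the hallmark of a heuristic method.

import heapq

def greedy_best_first(start, goal, neighbors, h):
    """Return a path from start to goal guided by heuristic h, or None."""
    frontier = [(h(start), start, [start])]
    visited = {start}
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for nxt in neighbors(node):
            if nxt not in visited:
                visited.add(nxt)
                heapq.heappush(frontier, (h(nxt), nxt, path + [nxt]))
    return None

# Walk a 4-connected grid toward (2, 2), using Manhattan distance as h.
goal = (2, 2)
def neighbors(p):
    x, y = p
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
def manhattan(p):
    return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

path = greedy_best_first((0, 0), goal, neighbors, manhattan)
```

Replacing the heuristic with one that also counts the cost already paid would turn this into A* search; the version above trades that guarantee of optimality for speed, as heuristics characteristically do.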

Behavioural economics

In behavioural economics, heuristics are the cognitive shortcuts that individuals use to simplify decision-making in economic situations. Behavioural economics is a field that integrates insights from psychology and economics to better understand how people make decisions.

Anchoring and adjustment is one of the most extensively researched heuristics in behavioural economics. Anchoring is the tendency of people to make future judgements or conclusions based too heavily on the original information supplied to them. This initial knowledge functions as an anchor, and it can influence future judgements even if the anchor is entirely unrelated to the decisions at hand. Adjustment, on the other hand, is the process through which individuals make gradual changes to their initial judgements or conclusions.

Anchoring and adjustment has been observed in a wide range of decision-making contexts, including financial decision-making, consumer behavior, and negotiation. Researchers have identified a number of strategies that can be used to mitigate the effects of anchoring and adjustment, including providing multiple anchors, encouraging individuals to generate alternative anchors, and providing cognitive prompts to encourage more deliberative decision-making.

Other heuristics studied in behavioral economics include the representativeness heuristic, which refers to the tendency of individuals to categorize objects or events based on how similar they are to typical examples,[29] and the availability heuristic, which refers to the tendency of individuals to judge the likelihood of an event based on how easily it comes to mind.[30]

Types

Availability heuristic

According to Tversky and Kahneman (1973), the availability heuristic is the tendency to judge events that can be recalled more easily as more likely to occur than events that are harder to recall.[31] An example of this would be asking someone whether they believe they are more likely to die in a shark attack or in a drowning incident. Someone may quickly answer with the incorrect belief that they are more likely to die from a shark attack, as the event is more easily remembered and is often covered more heavily than drowning deaths in the news. The correct answer is that people are more likely to die of drowning (1 in 1,134) than after being bitten by a shark (1 in 4,332,817).[32]
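The gap between the remembered and the actual risk can be made concrete with a quick calculation from the odds quoted above:

```python
# Compare the two lifetime odds cited in the text: 1 in 1,134 for
# drowning versus 1 in 4,332,817 for a fatal shark bite.

drowning_odds = 1 / 1134
shark_odds = 1 / 4332817

# How many times more likely is drowning than a fatal shark bite?
times_more_likely = drowning_odds / shark_odds
```

The ratio works out to roughly 3,800: the easily recalled event is thousands of times less likely than the mundane one, which is exactly the distortion the availability heuristic produces.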

Representativeness heuristic

The representativeness heuristic refers to the cognitive bias whereby people rely on a preconceived mental image or prototype of a category or concept, rather than on actual probabilities and statistical data, when making judgments. This often leads to stereotyping and over-generalization from limited information, causing errors and distorted views of reality.[33]

For instance, when trying to guess someone's occupation based on their appearance, the representativeness heuristic might lead one to assume that an individual in a suit must be a lawyer or businessperson, while someone in uniform fits the police officer or soldier category. This shortcut can sometimes be useful but may also result in stereotypes and overgeneralizations.


References

  1. Myers, David G. (2010). Social psychology (Tenth ed.). New York, NY: McGraw-Hill. p. 94. ISBN 978-0-07337-066-8. OCLC 667213323. 
  2. "Heuristics—Explanation and examples". https://conceptually.org/concepts/heuristics. 
  3. Pearl, Judea (1983). Heuristics: Intelligent Search Strategies for Computer Problem Solving. New York, NY: Addison-Wesley. p. vii. ISBN 978-0-201-05594-8. 
  4. Emiliano, Ippoliti (2015). Heuristic Reasoning: Studies in Applied Philosophy, Epistemology and Rational Ethics. Switzerland: Springer International Publishing. pp. 1–2. ISBN 978-3-319-09159-4. https://drive.google.com/file/d/0B_q6VhhkczIYcC1nWEV2ejZfOGs/view?usp=sharing. Retrieved 2015-11-24. 
  5. Sunstein, Cass (2005). "Moral Heuristics". The Behavioral and Brain Sciences 28 (4): 531–542. doi:10.1017/S0140525X05000099. PMID 16209802. 
  6. Pólya, George (1945) How to Solve It: A New Aspect of Mathematical Method, Princeton, NJ: Princeton University Press. ISBN:0-691-02356-5 ISBN:0-691-08097-6
  7. Gigerenzer, Gerd (1991). "How to Make Cognitive Illusions Disappear: Beyond "Heuristics and Biases"". European Review of Social Psychology 2: 83–115. doi:10.1080/14792779143000033. https://library.mpib-berlin.mpg.de/ft/gg/gg_how_1991.pdf. Retrieved 14 October 2012. 
  8. Kahneman, Daniel; Slovic, Paul; Tversky, Amos, eds (30 April 1982). Judgment Under Uncertainty. Cambridge, UK: Cambridge University Press. doi:10.1017/cbo9780511809477. ISBN 978-0-52128-414-1. 
  9. Heuristics and heuristic evaluation. http://www.interaction-design.org/encyclopedia/heuristics_and_heuristic_evaluation.html. Retrieved 1 September 2013. 
  10. Groner, Rudolf; Groner, Marina; Bischof, Walter F. (1983). Methods of Heuristics. Hillsdale, NJ: Lawrence Erlbaum. 
  11. Groner, Rudolf; Groner, Marina (1991). "Heuristische versus algorithmische Orientierung als Dimension des individuellen kognitiven Stils" (in de). Über die richtige Art, Psychologie zu betreiben. Göttingen: Hogrefe. ISBN 978-3-80170-415-5. 
  12. Gigerenzer, Gerd; Todd, Peter M.; and the ABC Research Group (1999). Simple Heuristics That Make Us Smart. Oxford, UK: Oxford University Press. ISBN 978-0-19512-156-8. 
  13. Bounded Rationality: The Adaptive Toolbox. Cambridge, MA: MIT Press. 2002. ISBN 978-0-26257-164-7. 
  14. Gigerenzer, Gerd; Hertwig, Ralph; Pachur, Thorsten (15 April 2011). Heuristics: The Foundations of Adaptive Behavior. Oxford University Press. doi:10.1093/acprof:oso/9780199744282.001.0001. ISBN 978-0-19989-472-7. 
  15. Gigerenzer, Gerd; Gaissmaier, Wolfgang (January 2011). "Heuristic Decision Making". Annual Review of Psychology 62: 451–482. doi:10.1146/annurev-psych-120709-145346. PMID 21126183. 
  16. De Neys, Wim (18 October 2008). "Cognitive experiential self theory". Perspectives on Psychological Science 7 (1): 28–38. doi:10.1177/1745691611429354. PMID 26168420. http://www.psych-it.com.au/Psychlopedia/article.asp?id=53. 
  17. Epstein, S.; Pacini, R.; Denes-Raj, V.; Heier, H. (1996). "Individual differences in intuitive-experiential and analytical-rational thinking styles". Journal of Personality and Social Psychology 71 (2): 390–405. doi:10.1037/0022-3514.71.2.390. PMID 8765488. 
  18. Kahneman, Daniel; Frederick, Shane (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge, UK: Cambridge University Press. pp. 49–81. ISBN 978-0-52179-679-8. OCLC 47364085. https://archive.org/details/heuristicsbiases00gilo. 
  19. Kahneman, Daniel (December 2003). "Maps of Bounded Rationality: Psychology for Behavioral Economics". American Economic Review 93 (5): 1449–1475. doi:10.1257/000282803322655392. ISSN 0002-8282. http://www.econ.tuwien.ac.at/Lotto/papers/Kahneman2.pdf. 
  20. Cioffi, Jane (1997). "Heuristics, servants to intuition, in clinical decision making". Journal of Advanced Nursing 26 (1): 203–208. doi:10.1046/j.1365-2648.1997.1997026203.x. PMID 9231296. 
  21. Jaszczolt, K. M. (2006). "Defaults in Semantics and Pragmatics". Stanford Encyclopedia of Philosophy. ISSN 1095-5054. https://plato.stanford.edu/entries/defaults-semantics-pragmatics/. Retrieved 2021-06-08. 
  22. Frigg, Roman; Hartmann, Stephan (2006). "Models in Science". Stanford Encyclopedia of Philosophy. ISSN 1095-5054. https://plato.stanford.edu/entries/models-science/. Retrieved 2021-06-08. 
  23. Kiss, Olga (2006). "Heuristic, Methodology or Logic of Discovery? Lakatos on Patterns of Thinking". Perspectives on Science 14 (3): 302–317. doi:10.1162/posc.2006.14.3.302. 
  24. Heuristics and the Law. Cambridge, MA: MIT Press. 2007. ISBN 978-0-262-07275-5. 
  25. Johnson, Eric E. (2006). "Calibrating Patent Lifetimes". Santa Clara Computer & High Technology Law Journal 22: 269–314. http://www.eejlaw.com/writings/Johnson_Calibrating_Patent_Lifetimes.pdf. 
  26. Bodenhausen, Galen V. (1999). "On the Dialectics of Discrimination: Dual Processes in Social Stereotyping". in Chaiken, Shelly; Trope, Yaacov. Dual-process Theories in Social Psychology. New York, NY: Guilford Press. pp. 271–292. ISBN 978-1-57230-421-5. https://books.google.com/books?id=5X_auIBx99EC. 
  27. Kleg, Milton (1993). Hate Prejudice and Racism. Albany, NY: State University of New York Press. p. 135. ISBN 978-0-79141-536-8. https://books.google.com/books?id=aaMuQHO04I0C&pg=PA135. Retrieved 2015-03-24. 
  28. Gökçen, Sinan (20 November 2007). "Pictures in Our Heads". http://www.errc.org/cikk.php?cikk=2868. 
  29. Bhatia, Sudeep (2015). "Conceptualizing and studying linguistic representations across multiple levels of analysis: The case of L2 processing research". Cognitive Science 39: 122–148. https://www.sas.upenn.edu/~bhatiasu/Bhatia%202015%20CogSci%20PP.pdf. Retrieved 2023-04-20. 
  30. Dale, Sarah (2015). "Heuristics and biases: The science of decision-making". Business Information Review 32 (2): 93–99. doi:10.1177/0266382115592536. 
  31. Tversky, Amos; Kahneman, Daniel (1973-09-01). "Availability: A heuristic for judging frequency and probability" (in en). Cognitive Psychology 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9. ISSN 0010-0285. https://dx.doi.org/10.1016/0010-0285%2873%2990033-9. Retrieved 2023-08-24. 
  32. "Shark Attack Statistics - Frequency & Fatality Worldwide" (in en-US). 2023-02-02. https://worldanimalfoundation.org/advocate/shark-attack-statistics/. 
  33. Kahneman, Daniel; Tversky, Amos (July 1973). "On the psychology of prediction." (in en). Psychological Review 80 (4): 237–251. doi:10.1037/h0034747. ISSN 1939-1471. http://doi.apa.org/getdoi.cfm?doi=10.1037/h0034747. Retrieved 2023-05-09. 
