Cognitive miser
In psychology, the human mind is considered to be a cognitive miser due to the tendency of people to think and solve problems in simpler and less effortful ways rather than in more sophisticated and more effortful ways, regardless of intelligence.[1] Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending cognitive effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain how and why people are cognitive misers.[2][3]
The term cognitive miser was first introduced by Susan Fiske and Shelley Taylor in 1984. It is an important concept in social cognition theory and has been influential in other social sciences, including but not limited to economics and political science.[2]
"People are limited in their capacity to process information, so they take shortcuts whenever they can."[2]
Assumption
The metaphor of the cognitive miser assumes that the human mind is rather limited in time, knowledge, attention, and cognitive resources.[4] Usually people do not think rationally or cautiously, but use cognitive shortcuts to make inferences and form judgments.[5][6] These shortcuts include the use of schemas, scripts, stereotypes, and other simplified perceptual strategies instead of careful thinking. For example, people tend to engage in correspondent inference, assuming that a person's behavior reflects stable, underlying characteristics.[7]
Background
The naïve scientist and attribution theory
Before Fiske and Taylor's cognitive miser theory, the predominant model of social cognition was the naïve scientist. First proposed in 1958 by Fritz Heider in The Psychology of Interpersonal Relations, this theory holds that humans think and act with dispassionate rationality, engaging in detailed and nuanced thought processes for both complex and routine actions.[8] In this way, humans were thought to think like scientists, albeit naïve ones, measuring and analyzing the world around them. Within this framework, naïve scientists seek the consistency and stability that come from a coherent view of the world and from a sense of control over their environment.[9][page needed]
In order to meet these needs, naïve scientists make attributions.[10][page needed] Thus, attribution theory emerged from the study of the ways in which individuals assess causal relationships and mechanisms.[11] Through the study of causal attributions, led by Harold Kelley and Bernard Weiner amongst others, social psychologists began to observe that subjects regularly demonstrate several attributional biases including but not limited to the fundamental attribution error.[12]
The study of attributions had two effects: it created further interest in testing the naïve scientist model, and it opened up a new wave of social psychology research that questioned the model's explanatory power. This second effect helped to lay the foundation for Fiske and Taylor's cognitive miser.[9][page needed]
Stereotypes
According to Walter Lippmann's arguments in his classic book Public Opinion,[13] people are not equipped to deal with complexity. Attempting to observe things freshly and in detail is mentally exhausting, especially in the midst of busy affairs. The term stereotype is thus introduced: people must reconstruct a complex situation on a simpler model before they can cope with it, and that simpler model can be regarded as a stereotype. Stereotypes are formed from outside sources identified with people's own interests, and they tend to be reinforced because people are most impressed by facts that fit their existing philosophy.
Furthermore, in Lippmann's view, people are told about the world before they see it.[13] People's behavior is based not on direct and certain knowledge, but on pictures made by or given to them. Hence, the influence of external factors in shaping people's stereotypes cannot be neglected: "The subtlest and most pervasive of all influences are those which create and maintain the repertory of stereotypes."[13] That is to say, people live in a second-hand world of mediated reality, in which the simplified models used for thinking (i.e., stereotypes) can be created and maintained by external forces. Lippmann therefore suggested that the public "cannot be wise", since people are easily misled by an overly simplified reality that is consistent with their pre-existing pictures in mind, and any disturbance of existing stereotypes seems like "an attack upon the foundation of the universe".[13]
Although Lippmann did not directly define the term cognitive miser, stereotypes serve an important function in simplifying people's thinking processes. As a form of cognitive simplification, stereotyping permits the economical management of a reality that would otherwise overwhelm people with its complexity. The stereotype has since become a standard topic in sociology and social psychology.[14]
Heuristics
Much of the cognitive miser theory is built upon work done on heuristics in judgment and decision-making,[15][page needed] most notably the results of Amos Tversky and Daniel Kahneman, published in a series of influential articles.[16][17][18] Heuristics can be defined as the "judgmental shortcuts that generally get us where we need to go—and quickly—but at the cost of occasionally sending us off course."[19] In their work, Kahneman and Tversky demonstrated that people rely upon different types of heuristics or mental shortcuts in order to save time and mental energy.[18] However, when people rely upon heuristics instead of detailed analysis, such as the information processing employed by Heider's naïve scientist, biased information processing becomes more likely.[9][page needed] Some of these heuristics include:
- representativeness heuristic (the tendency to judge the probability that an individual belongs to a group by how closely he or she matches the group's prototype).[16]
- availability heuristic (the tendency to judge the likelihood of an event by the ease with which examples of it come to mind)[9][page needed][16]
- anchoring and adjustment heuristic (the tendency to overweight an initial piece of information and to adjust one's answer insufficiently away from this anchor).[18]
The frequency with which Kahneman, Tversky, and other attribution researchers found individuals employing mental shortcuts to make decisions and assessments laid important groundwork for the overarching idea that individuals and their minds act efficiently rather than analytically.[15][page needed] The worked example below illustrates how the representativeness heuristic can crowd out analytic reasoning.
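As an illustration of the gap between heuristic and analytic judgment, consider a version of Kahneman and Tversky's engineer-lawyer problem: a personality sketch "sounds like" an engineer, but engineers are a minority of the sample. The probabilities below are illustrative assumptions, not figures from the original studies; the sketch only shows the logical structure of base-rate neglect.

```python
# Base-rate neglect under the representativeness heuristic, shown with
# Bayes' rule. The description fits engineers far better than lawyers,
# but engineers make up only 30% of the sample. All numbers below are
# illustrative assumptions.
prior_engineer = 0.30            # base rate: 30 engineers per 100 people
p_desc_given_engineer = 0.90     # assumed fit of the description to engineers
p_desc_given_lawyer = 0.30       # assumed fit of the description to lawyers

# Analytic (Bayesian) answer, weighing fit AND base rate:
evidence = (p_desc_given_engineer * prior_engineer
            + p_desc_given_lawyer * (1 - prior_engineer))
posterior = p_desc_given_engineer * prior_engineer / evidence
print(f"Bayesian P(engineer | description) = {posterior:.2f}")   # 0.56

# Heuristic answer, judging by fit alone and ignoring the base rate:
heuristic = p_desc_given_engineer / (p_desc_given_engineer + p_desc_given_lawyer)
print(f"Representativeness-style judgment  = {heuristic:.2f}")   # 0.75
```

The heuristic judgment overshoots the analytic one because it ignores how rare engineers are in the sample, which is exactly the pattern Kahneman and Tversky documented.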
The cognitive miser theory
The wave of research on attributional biases done by Kahneman, Tversky, and others effectively ended the dominance of Heider's naïve scientist within social psychology.[15] Fiske and Taylor, building upon the prevalence of heuristics in human cognition, offered their theory of the cognitive miser. It is, in many ways, a unifying theory which suggests that humans engage in economically prudent thought processes, instead of acting like scientists who rationally weigh costs and benefits, test hypotheses, and update expectations based upon the results of the experiments that are our everyday actions.[2] In other words, humans are inclined to act as cognitive misers, using mental shortcuts to make assessments and decisions both about issues they know very little about and about issues of great salience. Fiske and Taylor argue that acting as a cognitive miser is rational given the sheer volume and intensity of information and stimuli humans take in.[2][20] Given individuals' limited information-processing capacity, people constantly adopt strategies that simplify complex problems. Cognitive misers usually do so in two ways: by ignoring part of the information to reduce their own cognitive load, or by overusing one kind of information to avoid searching for more.
However, other psychologists argue that the cognitively miserly tendency of humans is a primary reason why "humans are often less than rational".[3] This view holds that evolution has made the brain extremely stingy in allocating and using cognitive resources. The basic principle is to save mental energy as much as possible, even when it is necessary to "use your head".[21] Unless the cognitive environment meets certain requirements, people try to avoid thinking as much as possible.
Implications
The implications of this theory raise important questions about both cognition and human behavior. In addition to streamlining cognition in complicated, analytical tasks, cognitive misers are also at work when dealing with unfamiliar issues as well as issues of great importance.[2][20]
Politics
Voting behavior in democracies is an arena in which the cognitive miser is at work. Acting as a cognitive miser should lead those with expertise in an area to more efficient information processing and streamlined decision making.[22] However, as Lau and Redlawsk note, acting as a cognitive miser who employs heuristics can have very different results for high-information and low-information voters. They write, "...cognitive heuristics are at times employed by almost all voters, and that they are particularly likely to be used when the choice situation facing voters is complex... heuristic use generally increases the probability of a correct vote by political experts but decreases the probability of a correct vote by novices."[22] In democracies, where no vote is weighted more or less because of the expertise behind its casting, low-information voters acting as cognitive misers can make choices with broad and potentially deleterious consequences for a society.[22] A toy simulation of this expertise effect is sketched below.
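The following minimal simulation illustrates the pattern Lau and Redlawsk describe; it is not their model, and every parameter (issue counts, candidate counts, how expertise is operationalized) is an assumption made for illustration. A "correct" vote picks the candidate closest to the voter across all weighted issues; the heuristic compares candidates on a single issue, which an expert chooses well and a novice chooses at random.

```python
# Toy model of heuristic voting. Experts know which issue matters most
# to them; novices sample an issue at random. The heuristic is the same
# shortcut in both cases, but it tracks the full-information choice far
# better for experts.
import random

N_ISSUES, N_CANDIDATES, TRIALS = 8, 4, 20_000

def correct_vote_rate(expert: bool) -> float:
    hits = 0
    for _ in range(TRIALS):
        voter = [random.random() for _ in range(N_ISSUES)]
        raw = [random.random() for _ in range(N_ISSUES)]
        weights = [w / sum(raw) for w in raw]  # unequal issue importance
        cands = [[random.random() for _ in range(N_ISSUES)]
                 for _ in range(N_CANDIDATES)]

        # Full-information benchmark: weighted distance over all issues.
        def dist(c):
            return sum(w * abs(v - x) for w, v, x in zip(weights, voter, c))
        best = min(range(N_CANDIDATES), key=lambda i: dist(cands[i]))

        # Heuristic: compare candidates on one issue only.
        issue = (max(range(N_ISSUES), key=lambda j: weights[j])
                 if expert else random.randrange(N_ISSUES))
        pick = min(range(N_CANDIDATES),
                   key=lambda i: abs(voter[issue] - cands[i][issue]))
        hits += pick == best
    return hits / TRIALS

print("expert heuristic accuracy:", correct_vote_rate(True))
print("novice heuristic accuracy:", correct_vote_rate(False))
```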
Economics
Cognitive miserliness may also be one of the contributors to behavior in the prisoner's dilemma of game theory. To save cognitive energy, cognitive misers tend to assume that other people are similar to themselves: habitual cooperators assume that most others are cooperators, and habitual defectors assume that most others are defectors.
Since cooperators offer to play more often, and fellow cooperators more often accept their offers, Orbell and Dawes concluded that cooperators have a higher expected payoff than defectors when certain boundary conditions are met.[23] A toy calculation of this advantage is sketched below.
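The following sketch illustrates the logic, not Orbell and Dawes's actual model: agents project their own type onto potential partners (the false-consensus shortcut) when deciding whether to play a one-shot prisoner's dilemma at all. The payoff values and the opt-out rule are assumptions chosen for illustration.

```python
# Cooperator's advantage under projection: cooperators expect mutual
# cooperation (R) and so enter the game; defectors expect mutual
# defection (P) and stay out whenever P is below the opt-out payoff.
import random

R, S, T, P = 3, 0, 5, 1   # standard PD payoffs, with T > R > P > S
OPT_OUT = 2               # assumed payoff for declining to play

def expects(agent_type: str) -> int:
    """A cognitive miser assumes the partner shares its own type."""
    return R if agent_type == "C" else P

def payoff(me: str, other: str) -> int:
    table = {("C", "C"): R, ("C", "D"): S, ("D", "C"): T, ("D", "D"): P}
    return table[(me, other)]

def simulate(n: int = 100_000, frac_cooperators: float = 0.5) -> dict:
    totals = {"C": 0.0, "D": 0.0}
    counts = {"C": 0, "D": 0}
    for _ in range(n):
        a = "C" if random.random() < frac_cooperators else "D"
        b = "C" if random.random() < frac_cooperators else "D"
        # The game happens only if both sides project a payoff that
        # beats opting out; otherwise both take the outside option.
        play = expects(a) > OPT_OUT and expects(b) > OPT_OUT
        for me, other in ((a, b), (b, a)):
            counts[me] += 1
            totals[me] += payoff(me, other) if play else OPT_OUT
    return {t: totals[t] / counts[t] for t in ("C", "D")}

# Cooperators average about 2.5 (half R=3, half OPT_OUT=2), while
# defectors, who never expect enough to enter, stay at OPT_OUT=2.
print(simulate())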
Mass Communication
Lack of public support for emerging technologies is commonly attributed to a lack of relevant information and to low scientific literacy among the public. Known as the knowledge deficit model, this point of view rests on the idealistic assumptions that educating for scientific literacy will increase public support for science, and that the focus of science communication should therefore be increasing scientific understanding among the lay public.[24][25] However, the relationship between information and attitudes towards scientific issues is not empirically supported.[26][27]
Based on the assumption that human beings are cognitive misers who tend to minimize cognitive costs, the concept of low-information rationality has been introduced as an empirically grounded alternative for explaining decision making and attitude formation. Instead of developing an in-depth understanding of scientific topics, people make decisions based on shortcuts or heuristics such as ideological predispositions or cues from mass media, and therefore use only as much information as necessary.[28][29] The less expertise citizens have on an issue initially, the more likely they are to rely on these shortcuts.[29]
The cognitive miser theory thus has implications for persuasion: attitude formation is a competition between people's value systems and predispositions (their own interpretive schemata) on a certain issue and the way public discourse frames it.[29] Framing theory suggests that the same topic will produce different interpretations among audiences if the information is presented in different ways.[30] Changes in audience attitudes are closely connected with relabeling or reframing the issue in question. In this sense, effective communication can be achieved when media provide audiences with cognitive shortcuts or heuristics that resonate with underlying audience schemata.
Risk Assessment
The metaphor of the cognitive miser can assist people in drawing lessons from risk, understood as the possibility that an undesirable state of reality may occur.[31] People apply a number of shortcuts or heuristics when judging the likelihood of an event, because the rapid answers provided by heuristics are often right.[2][32] Yet these shortcuts can neglect certain pitfalls. A practical example of the cognitive miser's way of thinking in the risk assessment of the Deepwater Horizon explosion is presented below.[33]
- People have trouble imagining how small failings can pile up into a catastrophe (see the short calculation after this list);
- People tend to get accustomed to risk: when the current situation seems to run smoothly, people unconsciously raise their acceptance of risk;
- People tend to place excessive faith and confidence in backup systems and safety devices;
- People tend to match complicated technical systems with complicated governing structures;
- When personally invested in an issue, people tend to spread good news and hide bad news;
- People in the same field tend to think alike (see also: echo chamber), whether or not they are supervising the project in question.
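The first pitfall can be made concrete with a short calculation. The per-safeguard failure probability below is an illustrative assumption, not a figure from the Deepwater Horizon case; the point is only that many individually small failure chances compound quickly.

```python
# How small failings pile up: if a project depends on n independent
# safeguards, each failing with small probability p, the chance that
# at least one fails is 1 - (1 - p)^n, which grows quickly with n.
p = 0.01  # assumed per-safeguard failure probability (illustrative)
for n in (1, 10, 50, 100):
    at_least_one = 1 - (1 - p) ** n
    print(f"n = {n:3d}  P(at least one failure) = {at_least_one:.2f}")
# n =   1 -> 0.01,  n =  10 -> 0.10,  n =  50 -> 0.39,  n = 100 -> 0.63
```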
Psychology
The theory that human beings are cognitive misers also sheds light on dual process theory in psychology. Dual process theory proposes that there are two types of cognitive processes in the human mind. Daniel Kahneman suggested that these two processes could be named intuitive (or System 1) and reasoning (or System 2), respectively.[34]
System 1, which starts automatically and without control, demands little or even no effort yet can generate surprisingly complex patterns of ideas. System 2 allocates attention to the effortful mental activities required and can construct thoughts in an orderly series of steps.[35] These two cognitive processing systems are not separate and can interact with each other. Under the dual-process model, beliefs are formed in several steps:
- System 1 generates suggestions for System 2, with impressions, intuitions, intentions or feelings;
- If System 1's proposal is endorsed by System 2, those impressions and intuitions will turn into beliefs, and the sudden inspiration generated by System 1 will turn into voluntary actions;
- When everything goes smoothly (as is often the case), System 2 adopts the suggestions of System 1 with little or no modification. As a result, one will generally believe one's impressions and act on one's desires.
However, this does not mean that cognitive misers make little use of System 2. The "reasoning" process is activated to assist intuition when:
- A question arises, but System 1 does not generate an answer;
- An event is detected to violate the model of world that System 1 maintains.
Conflicts also exist in this dual process. A brief example provided by Kahneman: when we try not to stare at an oddly dressed couple at the neighboring table in a restaurant, our automatic reaction (System 1) makes us stare at them, and conflict emerges as System 2 tries to control this behavior.[35]
The dual-processing system can produce cognitive illusions. System 1 operates automatically, taking the easiest shortcut but sometimes erring, and System 2 may have no clue that an error has occurred. Errors can be prevented only by enhanced monitoring by System 2, which costs a great deal of cognitive effort.[35] The sketch below renders this division of labor as control flow.
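In this minimal sketch, the function names, the cached answers, and the surprise check are illustrative placeholders rather than part of Kahneman's theory; the code only encodes the sequence described above, in which System 1 proposes and System 2 intervenes when no answer arrives or the answer violates the world model.

```python
# Dual-process control flow: System 1 proposes a fast answer; System 2
# engages only when System 1 has no answer or the answer is surprising.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Judgment:
    answer: str
    system: str  # which system produced the answer

def system1(question: str) -> Optional[str]:
    """Fast, automatic, associative retrieval (may return nothing)."""
    cached = {"2 + 2": "4", "capital of France": "Paris"}
    return cached.get(question)

def violates_world_model(answer: str) -> bool:
    """Placeholder surprise detector for answers that clash with expectations."""
    return False

def system2(question: str) -> str:
    """Slow, effortful, step-by-step reasoning (cognitively expensive)."""
    return f"deliberate answer to {question!r}"

def judge(question: str) -> Judgment:
    intuition = system1(question)
    if intuition is None or violates_world_model(intuition):
        return Judgment(system2(question), system="System 2")
    # Default path: System 2 endorses System 1 with little modification.
    return Judgment(intuition, system="System 1")

print(judge("2 + 2"))     # answered by System 1
print(judge("17 x 24"))   # no intuition available, so System 2 engages
```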
Limitations
Omission of motivation
The cognitive miser theory, though it explains the cognitive processes that people tend to follow when making decisions, provides few hints on the role of motivation.[36] In Fiske's subsequent research, the omission of the role of "intent" from the metaphor of the cognitive miser is acknowledged. Motivation does affect the activation and use of stereotypes and prejudices.[37]
Updates and later research
Motivated tactician
As mentioned above, people tend to use heuristic shortcuts when making decisions. But a problem remains: although these shortcuts cannot compare to effortful thought in accuracy, people need some criterion for choosing the most adequate shortcut.[38] Kruglanski proposed that people are a combination of naïve scientists and cognitive misers: flexible social thinkers who choose among multiple cognitive strategies (i.e., speed/ease vs. accuracy/logic) based on their current goals, motives, and needs.[38]
Later models suggest that the cognitive miser and the naïve scientist are two poles of social cognition, each too monolithic on its own. Instead, Fiske, Taylor, Arie W. Kruglanski, and other social psychologists offer an alternative explanation of social cognition: the motivated tactician.[2] According to this theory, people employ either shortcuts or thoughtful analysis based upon the context and salience of a particular issue. In other words, this theory suggests that humans are, in fact, both naïve scientists and cognitive misers.[9][page needed] In this sense, people are strategic rather than passive in allocating their cognitive effort: rather than always choosing the most effortless shortcut, they can decide to act as naïve scientists or as cognitive misers depending on their goals.
Meaning seeker
The meaning seeker theory rejects both the cognitive miser and the motivated tactician as metaphors of human cognitive behavior. Built within the framework of self-categorization theory, researchers argue that people employ categorical thinking to make sense of the social world; this categorical thinking gives meaning to social stimuli under adverse or difficult processing conditions.[39]
See also
- Bounded rationality
- Motivated reasoning
- Representativeness heuristic
References
- ↑ Stanovich, Keith E. (2009). "The cognitive miser: ways to avoid thinking". What intelligence tests miss: the psychology of rational thought. New Haven: Yale University Press. pp. 70–85. ISBN 9780300123852. OCLC 216936066. See also other chapters in the same book: "Framing and the cognitive miser" (chapter 7); "A different pitfall of the cognitive miser: thinking a lot, but losing" (chapter 9).
- ↑ 2.0 2.1 2.2 2.3 2.4 2.5 2.6 2.7 Fiske, Susan T.; Taylor, Shelley E. (1991). Social cognition (2nd ed.). New York: McGraw-Hill. ISBN 978-0070211919. OCLC 22810253. https://archive.org/details/socialcognition0002fisk.
- ↑ 3.0 3.1 Toplak, Maggie E.; West, Richard F.; Stanovich, Keith E. (April 2014). "Assessing miserly information processing: an expansion of the Cognitive Reflection Test". Thinking & Reasoning 20 (2): 147–168. doi:10.1080/13546783.2013.844729.
- ↑ Simon, H. A. (1956). Rational choice and the structure of the environment. Psychological review, 63(2), 129.
- ↑ Gilovich, Thomas (2008). How we know what isn't so: the fallibility of human reason in everyday life. Free Press. OCLC 700511906.
- ↑ Nisbett, Richard E. (c. 1985). Human inference: strategies and shortcomings of social judgment. Prentice-Hall. ISBN 0134451309. OCLC 899043502. https://archive.org/details/humaninferencest0000nisb.
- ↑ Jones, E. E., & Davis, K. E. (1965). From acts to dispositions the attribution process in person perception. In Advances in experimental social psychology (Vol. 2, pp. 219-266). Academic Press.
- ↑ Heider, Fritz (1958). The psychology of interpersonal relations (1st ed.). New York: John Wiley & Sons. ISBN 978-0898592825. OCLC 225326.
- ↑ 9.0 9.1 9.2 9.3 9.4 Crisp, Richard J.; Turner, Rhiannon N. (2014). Essential social psychology (3rd ed.). New York: Sage Publications. ISBN 9781446270769. OCLC 873005953.
- ↑ Kassin, Saul; Fein, Steven; Markus, Hazel Rose (2016). Social psychology (10th ed.). Cengage Learning. ISBN 9781305580220. OCLC 952391832.
- ↑ Ross, Lee (1977). "The intuitive psychologist and his shortcomings: distortions in the attribution process". in Berkowitz, Leonard. Advances in experimental social psychology. 10. New York: Academic Press. pp. 173–220. ISBN 978-0120152100. OCLC 1283539.
- ↑ Jones, Edward E.; Harris, Victor A. (1967). "The attribution of attitudes". Journal of Experimental Social Psychology 3 (1): 1–24. doi:10.1016/0022-1031(67)90034-0.
- ↑ 13.0 13.1 13.2 13.3 Lippmann, W. (1965). Public Opinion. New York: Free Press. (Original work published 1922.)
- ↑ Jones, E. E., & Colman, A. M. (1996). Stereotypes. In A. Kuper and J. Kuper (Eds), The social science encyclopedia (2nd ed., pp. 843-844). London: Routledge
- ↑ 15.0 15.1 15.2 Barone, David F.; Maddux, James E.; Snyder, Charles R. (1997). Social cognitive psychology: history and current domains (1st ed.). New York: Plenum Press. ISBN 978-0306454752. OCLC 36330837.
- ↑ 16.0 16.1 16.2 Kahneman, Daniel; Tversky, Amos (1973). "On the psychology of prediction". Psychological Review 80 (4): 237–251. doi:10.1037/h0034747.
- ↑ Tversky, Amos; Kahneman, Daniel (1973). "Availability: a heuristic for judging frequency and probability". Cognitive Psychology 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9.
- ↑ 18.0 18.1 18.2 Tversky, Amos; Kahneman, Daniel (1974). "Judgment under uncertainty: heuristics and biases". Science 185 (4157): 1124–1131. doi:10.1126/science.185.4157.1124. PMID 17835457.
- ↑ Gilovich, Thomas; Savitsky, Kenneth (1996). "Like goes with like: the role of representativeness in erroneous and pseudoscientific beliefs". The Skeptical Inquirer 20 (2): 34–40. https://web.viu.ca/burnleyc/Psychology%20331/Like%20Goes%20with%20Like.pdf.
- ↑ 20.0 20.1 Scheufele, Dietram A.; Lewenstein, Bruce V. (17 May 2005). "The public and nanotechnology: how citizens make sense of emerging technologies". Journal of Nanoparticle Research 7 (6): 659–667 [660]. doi:10.1007/s11051-005-7526-2.
- ↑ Hull, David L. (2001). Science and selection: essays on biological evolution and the philosophy of science. Cambridge University Press. ISBN 0521643392. OCLC 876723188. https://archive.org/details/scienceselection0000hull.
- ↑ 22.0 22.1 22.2 Lau, Richard R.; David P. Redlawsk (4 Oct 2001). "Advantages and disadvantages of cognitive heuristics in political decision making". American Journal of Political Science 45 (4): 951–971. doi:10.2307/2669334.
- ↑ Orbell, John; Dawes, Robyn M. (June 1991). "A "Cognitive Miser" Theory of Cooperators Advantage". American Political Science Review 85 (2): 515–528. doi:10.2307/1963172. ISSN 0003-0554.
- ↑ Irwin, Alan; Wynne, Brian, eds (1996). Misunderstanding science?. Cambridge University Press. doi:10.1017/cbo9780511563737. ISBN 9780521432689.
- ↑ Marks, Nicola J (2016-11-15), "Public Understanding of Genetics: The Deficit Model", eLS, John Wiley & Sons, Ltd, pp. 1–5, doi:10.1002/9780470015902.a0005862.pub3, ISBN 9780470015902
- ↑ Kellstedt, Paul M.; Zahran, Sammy; Vedlitz, Arnold (February 2008). "Personal Efficacy, the Information Environment, and Attitudes Toward Global Warming and Climate Change in the United States". Risk Analysis 28 (1): 113–126. doi:10.1111/j.1539-6924.2008.01010.x. ISSN 0272-4332. PMID 18304110.
- ↑ Scheufele DA (2013) Communicating science in social settings. Proc Natl Acad Sci USA 110(Suppl 3):14040–14047
- ↑ Popkin, Samuel (1991). The Reasoning Voter. Chicago, IL: The University of Chicago Press. ISBN 0226675440.
- ↑ 29.0 29.1 29.2 Scheufele, D. A., & Turney, J. (2006). Messages and heuristics: How audiences form attitudes about emerging technologies. Engaging science: Thoughts, deeds, analysis and action.
- ↑ Scheufele, D. A., & Tewksbury, D. (2006). Framing, agenda setting, and priming: The evolution of three media effects models. Journal of communication, 57(1), 9-20.
- ↑ National Research Council (1983). Risk assessment in the federal government: managing the process. Washington, DC: National Academy Press.
- ↑ Marteau, T. M (1999-01-01). "Communicating genetic risk information". British Medical Bulletin 55 (2): 414–428. doi:10.1258/0007142991902466. ISSN 0007-1420. PMID 10723866.
- ↑ Brooks, David (May 27, 2010). "Drilling for Certainty". New York Times. https://www.nytimes.com/2010/05/28/opinion/28brooks.html.
- ↑ Kahneman, D. (2003). "A perspective on judgment and choice: mapping bounded rationality". American Psychologist 58 (9): 697–720. CiteSeerX 10.1.1.186.3636. doi:10.1037/0003-066x.58.9.697. PMID 14584987.
- ↑ 35.0 35.1 35.2 Kahneman, D. (2011). Thinking, fast and slow. Macmillan.
- ↑ Fiske, Susan T.; Taylor, Shelley E. (2017). Social cognition: from brains to culture (3rd ed.). London: Sage. ISBN 978-1473969292. OCLC 968775128.
- ↑ Fiske, S. T. (2004). Intent and Ordinary Bias: Unintended Thought and Social Motivation Create Casual Prejudice. Social Justice Research, 17(2), 117–127. doi:10.1023/b:sore.0000027405.94966
- ↑ 38.0 38.1 Kruglanski, A. W. (1994). The social-cognitive bases of scientific knowledge. The social psychology of science, 197-213.
- ↑ Oakes, P. J., & Turner, J. C. (1990). Is limited information processing capacity the cause of social stereotyping?. European review of social psychology, 1(1), 111-135.
Further reading
- Barr, Nathaniel; Pennycook, Gordon; Stolz, Jennifer A.; Fugelsang, Jonathan A. (July 2015). "The brain in your pocket: evidence that smartphones are used to supplant thinking". Computers in Human Behavior 48: 473–480. doi:10.1016/j.chb.2015.02.029.
- De Neys, Wim; Rossi, Sandrine; Houdé, Olivier (April 2013). "Bats, balls, and substitution sensitivity: cognitive misers are no happy fools". Psychonomic Bulletin & Review 20 (2): 269–273. doi:10.3758/s13423-013-0384-5. PMID 23417270.
- Stanovich, Keith E. (2011). "The cognitive miser and focal bias". Rationality and the reflective mind. New York: Oxford University Press. pp. 65–71. doi:10.1093/acprof:oso/9780195341140.003.0004. ISBN 9780195341140. OCLC 648932780.
Original source: https://en.wikipedia.org/wiki/Cognitive_miser