Implicit stereotype

Short description: Unreflected, mistaken attributions to and descriptions of social groups

An implicit bias or implicit stereotype is the pre-reflective attribution of particular qualities by an individual to a member of some social outgroup.[1]

Implicit stereotypes are thought to be shaped by experience and based on learned associations between particular qualities and social categories, including race and gender.[2] Individuals' perceptions and behaviors can be influenced by the implicit stereotypes they hold, even when they are unaware they hold such stereotypes.[3] Implicit bias is an aspect of implicit social cognition: the phenomenon that perceptions, attitudes, and stereotypes can operate prior to conscious intention or endorsement.[4] The existence of implicit bias is supported by a variety of scientific articles in the psychological literature.[5] The term implicit stereotype was first defined by psychologists Mahzarin Banaji and Anthony Greenwald in 1995.

Explicit stereotypes, by contrast, are consciously endorsed, intentional, and sometimes controllable thoughts and beliefs.[6]

Implicit biases, however, are thought to be the product of associations learned through past experiences.[7] Implicit biases can be activated by the environment and operate prior to a person's intentional, conscious endorsement.[1] Implicit bias can persist even when an individual rejects the bias explicitly.[1]

Bias, attitude, stereotype and prejudice

Attitudes, stereotypes, prejudices, and bias are all examples of psychological constructs. Psychological constructs are mental associations that can influence a person's behavior and feelings toward an individual or group. If the person is unaware of these mental associations, the stereotype, prejudice, or bias is said to be implicit.

Bias is defined as prejudice in favor of or against one thing, person, or group compared with another, usually in a way considered to be unfair. Bias can be seen as the overarching concept encompassing stereotype and prejudice, because it describes how we associate traits (usually negative) with a specific group of people. Our “implicit attitudes reflect constant exposure to stereotypical portrayals of members of, and items in, all kinds of different categories: racial groups, professions, women, nationalities, members of the LGBTQ community, disabilities, moral and political values, etc.”[8]

An attitude is an evaluative judgment of an object, a person, or a social group.[9] An attitude is held by or characterizes a person. Implicit attitudes are evaluations that occur without conscious awareness towards an attitude object or the self.

A stereotype is the association of a person or a social group with a consistent set of traits. This may include both positive and negative traits, such as the beliefs that African Americans are great at sports or that African Americans are more violent than any other race in the United States. Many types of stereotypes exist, such as racial, cultural, gender, and group-based (e.g., college students) stereotypes, all of which figure prominently in the lives of many people.

Prejudice is defined as an unfair negative attitude toward a social group or a member of that group.[10] Prejudices can stem from many characteristics that people observe in a different social group, including, but not limited to, gender, sex, race/ethnicity, or religion. This is pertinent to stereotypes because a stereotype can influence the way people feel toward another group, thereby producing prejudice.

Methods for investigation

There is a clear challenge in measuring the degree to which someone is biased. Bias takes two forms, implicit and explicit, and the two forms are connected. Explicit bias encompasses conscious attitudes that can be measured by self-report, although self-report carries the risk that individuals will falsely endorse more socially desirable attitudes. Implicit biases have traditionally been considered unconscious and involuntary attitudes lying below the surface of consciousness, yet some people seem to be aware of their influence on their behavior and cognitive processes.[11] The implicit-association test (IAT) is one validated tool used to measure implicit bias; it requires participants to rapidly pair two social groups with either positive or negative attributes.[12]

Implicit-association test

The implicit-association test (IAT) purports to predict the prejudice an individual has toward different social groups. The test claims to do this by capturing differences in the time it takes respondents to categorize stimuli under different pairings of categories. Respondents are instructed to click one of two computer keys to sort stimuli into associated categories. When the category pairings appear consistent to the respondent, the time taken to categorize the stimuli is shorter than when the pairings seem inconsistent. An implicit association is said to exist when respondents take longer to respond to a category-inconsistent pairing than to a category-consistent pairing. The implicit-association test is used in psychology for a wide array of topics, including gender, race, science, career, weight, sexuality, and disability.[13] Although widely used and highly influential, the implicit-association test lacks a strong scientific consensus behind it. Critics of the implicit-association test cite studies that counterintuitively link biased test scores with less discriminatory behavior.[14] Studies have also asserted that the implicit-association test fails to measure unconscious thought.[3]
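
At its core, the IAT score is a contrast between response latencies in category-consistent and category-inconsistent blocks. The Python sketch below illustrates that idea with hypothetical latencies and a simplified version of the conventional D-score (mean latency difference divided by the pooled standard deviation); the trial-filtering and error-penalty rules of the published scoring algorithm are omitted.

```python
from statistics import mean, stdev

def iat_d_score(consistent_ms, inconsistent_ms):
    """Simplified D-score: positive values indicate slower responses on
    category-inconsistent pairings, i.e. a stronger implicit association
    with the category-consistent pairing."""
    pooled_sd = stdev(consistent_ms + inconsistent_ms)  # SD over all trials in both blocks
    return (mean(inconsistent_ms) - mean(consistent_ms)) / pooled_sd

# Hypothetical response latencies (milliseconds) for one respondent
consistent = [612, 580, 645, 599, 630, 610]     # category-consistent block
inconsistent = [743, 801, 768, 722, 790, 755]   # category-inconsistent block
print(round(iat_d_score(consistent, inconsistent), 2))
```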

Go/no-go association task (GNAT)

The GNAT is similar to the implicit-association test. Whereas the IAT reveals differential associations between two target concepts (e.g. male-female and weak-strong), the GNAT reveals associations within one concept (for example, whether female is associated more strongly with weak or strong).[15]

Participants are presented with word pairs among distractors. Participants are instructed to indicate "go" if the words are target pairs, or "no-go" if they are not. For example, participants may be instructed to indicate "go" if the word pairs are female names and words that are related to strength. Then, participants are instructed to indicate "go" if the word pairs are female names and words that are related to weakness. This method relies on signal detection theory; participants' accuracy rates reveal endorsement of the implicit stereotype. For example, if participants are more accurate for female-weak pairs than for female-strong pairs, this suggests that they associate females with weakness more strongly than with strength.[16]
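
Because GNAT scoring draws on signal detection theory, each block's accuracy can be summarized as a sensitivity (d') value. The Python sketch below shows one common way to compute d' from hypothetical trial counts; the correction used to avoid extreme rates and the specific numbers are assumptions for illustration, not the exact analysis of the cited studies.

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate).
    A log-linear correction (+0.5 per cell) keeps rates away from 0 and 1."""
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return z(hit_rate) - z(fa_rate)

# Hypothetical trial counts for one participant in two GNAT blocks
female_weak = d_prime(hits=44, misses=6, false_alarms=8, correct_rejections=42)
female_strong = d_prime(hits=36, misses=14, false_alarms=15, correct_rejections=35)
# Higher sensitivity in the female + weak block than in the female + strong block
# suggests a stronger implicit female-weak association.
print(round(female_weak, 2), round(female_strong, 2))
```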

Semantic priming and lexical decision task

Semantic priming measures the association between two concepts.[17] In a lexical decision task, subjects are presented with strings of letters and asked to indicate whether each string is a word (for example, "butter") or a non-word (for example, "tubter"). The theory behind semantic priming is that subjects are quicker to respond to a word if it is preceded by a word related to it in meaning (e.g. bread-butter vs. bread-dog).[17] In other words, the word "bread" primes other words related in meaning, including butter. Psychologists use semantic priming to reveal implicit associations between stereotype-congruent words. For instance, participants may be asked to indicate whether pronouns are male or female. These pronouns are either preceded by professions that are predominantly female ("secretary", "nurse") or predominantly male ("mechanic", "doctor"). Reaction times reveal the strength of association between professions and gender.[18]
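
The dependent measure in such priming studies is essentially a reaction-time difference. The short Python sketch below illustrates that computation with hypothetical latencies; it is not the analysis pipeline of any particular cited study.

```python
from statistics import mean

# Hypothetical lexical-decision latencies (milliseconds) for one participant
related_rt_ms = [521, 498, 540, 515, 530]     # e.g. "nurse" prime -> "she" target
unrelated_rt_ms = [588, 602, 575, 610, 590]   # e.g. "mechanic" prime -> "she" target

# The priming effect is the time saved when the prime is related to the target;
# a larger effect indicates a stronger (here, stereotype-congruent) association.
priming_effect_ms = mean(unrelated_rt_ms) - mean(related_rt_ms)
print(f"Priming effect: {priming_effect_ms:.0f} ms")
```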

Sentence completion

In a sentence completion task, subjects may be presented with sentences that contain stereotypically black and white names (Jerome, Adam), positive and negative stereotypically black behaviors (easily made the team, blasted loud music in his car), and counter-stereotypic behaviors (got a job at Microsoft, refused to dance). Subjects are asked to complete each sentence in any way that is grammatical; for example, "Jerome got an A on his test..." could be completed with "because it was easy" (stereotype-congruent), "because he studied for months" (stereotype-incongruent), or "and then he went out to celebrate" (non-explanatory). This task is used to measure stereotypic explanatory bias (SEB): participants have a larger SEB if they give more explanations for stereotype-congruent sentences than for stereotype-incongruent sentences, and if they give more stereotype-congruent explanations.[19]
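
One way to turn such codings into a score is to contrast explanation rates across the two sentence types. The Python sketch below is a deliberately simplified, hypothetical index; the coding scheme and scoring used in the cited SEB research are more elaborate.

```python
def seb_index(congruent_explained, incongruent_explained):
    """Contrast how often a participant gives an explanatory continuation to
    stereotype-congruent versus stereotype-incongruent sentences.
    Each argument is a list of booleans (True = explanatory completion)."""
    congruent_rate = sum(congruent_explained) / len(congruent_explained)
    incongruent_rate = sum(incongruent_explained) / len(incongruent_explained)
    return congruent_rate - incongruent_rate  # positive values = larger SEB

# Hypothetical coded completions for one participant
congruent = [True, True, False, True, True]        # stereotype-congruent sentences
incongruent = [False, True, False, False, False]   # stereotype-incongruent sentences
print(round(seb_index(congruent, incongruent), 2))
```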

Differences between measures

The Implicit Association Test (IAT), sequential priming, and other implicit bias tests are mechanisms for determining how susceptible we are to stereotypes. They are widely used in social psychology, although whether response time to a question is a good measure of implicit bias is still up for debate. “Some theorists do question the interpretation of the scores from tests such as the IAT, but the debate is still going on and responses to the criticisms are certainly widespread.”[8]

In qualitative market research, researchers have described a framework called bias testing to mitigate researcher bias when designing survey questions. It involves empirically testing the survey questions with real-life respondents using interviewer moderated or technology-enabled unmoderated techniques.[20]

Findings

Gender bias

Gender biases are the stereotypical attitudes or prejudices that we hold towards specific genders. "The concept of gender also refers to the constantly ongoing social construction of what is considered ‘feminine’ and ‘masculine’ and is based on power and sociocultural norms about women and men."[21] Gender biases are the ways in which we judge men and women based on the hegemonically feminine and masculine traits assigned to them.

The category of male has been found to be associated with traits of strength and achievement. Both male and female subjects associate male category members more strongly than female category members with words like bold, mighty, and power.[22] The strength of this association is not predicted by explicit beliefs, such as responses on a gender stereotype questionnaire (for example, one question asked if subjects endorsed the word feminist).[1] In a test designed to reveal the false fame effect, non-famous male names were more likely to be falsely identified as famous than non-famous female names; this is evidence for an implicit stereotype of male achievement.[23] Females are more strongly associated with weakness. This is true for both male and female subjects, but female subjects only show this association when the weak words are positive, such as fine, flower, and gentle; female subjects do not show this pattern when the weak words are negative, such as feeble, frail, and scrawny.[22]

Particular professions are implicitly associated with genders. Elementary school teachers are implicitly stereotyped to be female, and engineers are stereotyped to be male.[24]

Gender bias in science and engineering

Implicit-association tests reveal an implicit association of males with science and math, and of females with arts and language.[25] Girls as young as nine years old have been found to hold an implicit male-math stereotype and an implicit preference for language over math.[26] Women have stronger negative associations with math than men do, and the more strongly women identify with a female gender identity, the more implicit negativity they show towards math.[25] For both men and women, the strength of these implicit stereotypes predicts implicit and explicit math attitudes, belief in one's math ability, and SAT performance.[25] The strength of these implicit stereotypes in elementary-aged girls predicts academic self-concepts, academic achievement, and enrollment preferences, even more than explicit measures do.[26] Women with a stronger implicit gender-math stereotype were less likely to pursue a math-related career, regardless of their actual math ability or explicit gender-math stereotypes.[27] This may be because women with stronger implicit gender-math stereotypes are more at risk for stereotype threat; women with strong implicit stereotypes perform much worse on a math test when primed with gender than women who have weak implicit stereotypes.[28]

Though the number of women pursuing and earning degrees in engineering has increased in the last 20 years, women remain behind men at all degree levels in all fields of engineering.[29] These implicit gender stereotypes are robust; in a study of more than 500,000 respondents from 34 nations, more than 70% of individuals held this implicit stereotype.[30] The national strength of the implicit stereotype is related to national sex differences among 8th graders on the TIMSS, an international standardized math and science achievement exam. This effect is present even after statistically controlling for gender inequality in general.[30] Additionally, for women across cultures, studies have shown that individual differences in the strength of this implicit stereotype are associated with interest, participation, and performance in the sciences.[30] Extending to the professional world, implicit biases and subsequent explicit attitudes toward women can "negatively affect the education, hiring, promotion, and retention of women in STEM".[31]

The effects of such implicit biases can be seen across multiple studies, including the following:

  • Parents rate the math abilities of their daughters lower than parents of sons rate their sons' abilities, even when the children perform identically well in school[32]
  • College faculty are less likely to respond to inquiries about research opportunities if the email appears to be from a woman as opposed to an identical email from a man[33]
  • Science faculty are less likely to hire or mentor students they believe are women as opposed to men[34]

An interagency report from the Office of Science and Technology Policy and the Office of Personnel Management has investigated systemic barriers, including implicit biases, that have traditionally inhibited women and underrepresented minorities in particular in science, technology, engineering, and mathematics (STEM), and it makes recommendations for reducing the impact of bias.[35] Research has shown that implicit bias training may improve attitudes towards women in STEM.[31]

Racial bias

Racial bias can be used synonymously with "stereotyping and prejudice" because "it allows for the inclusion of both positive and negative evaluations related to perceptions of race."[36] People begin to form racial biases toward other groups as young as age 3, developing an ingroup and outgroup view of members of various races, usually based initially on skin color.

In lexical decision tasks, after subjects are subliminally primed with the word BLACK, they are quicker to react to words consistent with black stereotypes, such as athletic, musical, poor and promiscuous. When subjects are subliminally primed with WHITE, they are quicker to react to white stereotypes, such as intelligent, ambitious, uptight and greedy.[37] These tendencies are sometimes, but not always, associated with explicit stereotypes.[37][38]

People may also hold an implicit stereotype that associates black category members with violence. People primed with words like ghetto, slavery, and jazz were more likely to interpret a character in a vignette as hostile.[39] However, this finding is controversial; because the character's race was not specified, it has been suggested that the procedure primed the race-unspecified concept of hostility and did not necessarily reflect stereotypes.[37]

An implicit stereotype of violent black men may associate black men with weapons. In a video game where subjects were supposed to shoot men with weapons and not shoot men with ordinary objects, subjects were more likely to shoot a black man with an ordinary object than a white man with an ordinary object. This tendency was related to subjects' implicit attitudes toward black people. Similar results were found in a priming task; subjects who saw a black face immediately before either a weapon or an ordinary object more quickly and accurately identified the image as a weapon than when it was preceded by a white face.[40]

Implicit race stereotypes affect behaviors and perceptions. When choosing between pairs of questions to ask a black interviewee, one of which is congruent with a racial stereotype, people with a high stereotypic explanatory bias (SEB) are more likely to ask the racially stereotype-congruent question. In a related study, subjects with a high SEB rated a black individual more negatively in an unstructured laboratory interaction.[19]

In-group and out-group bias

Group prototypes define social groups through a collection of attributes that describe both what representative group members have in common and what distinguishes the ingroup from relevant outgroups.[41] In-group favoritism, sometimes known as in-group–out-group bias, in-group bias, or intergroup bias, is a pattern of favoring members of one's in-group over out-group members. This can be expressed in evaluation of others, in allocation of resources, and in many other ways.[42][43] Implicit in-group preferences emerge very early in life,[44] even in children as young as six years old. In-group bias occurs when people who are ‘one of us’ (i.e., our ingroup) are favored over those in the outgroup, meaning those who differ from ourselves.[45] Ingroup favoritism is associated with feelings of trust and positive regard for ingroup members and surfaces often on measures of implicit bias. This categorization (ingroup vs. outgroup) is often automatic and pre-conscious.[46]

The reasons for in-group and out-group bias may be explained by ethnocentrism, social categorization, oxytocin, and other factors. A review by Carsten De Dreu found that oxytocin enables the development of trust, specifically towards individuals with similar characteristics - categorized as ‘in-group’ members - promoting cooperation with and favoritism towards such individuals.[47] People who report that they have strong needs for simplifying their environments also show more ingroup favoritism.[48] The tendency to categorize into ingroups and outgroups, and the resulting ingroup favoritism, is likely a universal aspect of human beings.[49]

We generally tend to hold implicit biases that favor our own ingroup, though research has shown that we can still hold implicit biases against our ingroup.[45][50] The most prominent example of negative affect towards an ingroup was recorded in 1939 by Kenneth and Mamie Clark using their now famous “Dolls Test”. In this test, African American children were asked to pick their favorite doll from a choice of otherwise identical black and white dolls. A high percentage of these African American children indicated a preference for the white dolls.[51] Social identity theory and Freudian theorists explain in-group derogation as the result of a negative self-image, which they believe is then extended to the group.[52]

Other stereotypes

Research on implicit stereotypes primarily focuses on gender and race. However, other topics, such as age, weight, and profession, have been investigated. IATs have revealed implicit stereotypes reflecting explicit stereotypes about adolescents. The results from these tests suggest that adolescents are more likely to be associated with words like trendy and defiant than adults are.[53] In addition, one IAT study found that older adults showed a stronger implicit preference for younger adults over older adults than younger adults did. The study also found that women and participants with more education had a lower implicit preference for younger adults.[54] IATs have also revealed implicit stereotypes linking obese individuals with low work performance. Words like lazy and incompetent are more associated with images of obese individuals than with images of thin ones.[55] This association is stronger for thin subjects than for overweight ones.[56] Like explicit stereotypes, implicit stereotypes may contain both positive and negative traits. This can be seen in examples of occupational implicit stereotypes, where people perceive preschool teachers as both warm and incompetent, while lawyers are judged as both cold and competent.[57]

Activation of implicit stereotypes

Implicit stereotypes are activated by environmental and situational factors. These associations develop over the course of a lifetime, beginning at a very early age, through exposure to direct and indirect messages. In addition to early life experiences, the media and news programming are often-cited origins of implicit associations.[58] In the laboratory, implicit stereotypes are activated by priming. When subjects are primed with dependence by unscrambling words such as dependent, cooperative, and passive, they judge a target female as more dependent. When subjects are primed with aggression with words like aggressive, confident, and argumentative, they judge a target male as more aggressive.[59] The fact that the female category is associated with words such as dependent, cooperative, and passive, and the male category with words like aggressive, confident, and argumentative, suggests an implicit gender stereotype. Stereotypes are also activated by subliminal primes. For example, white subjects exposed to subliminal words associated with a black stereotype (ghetto, slavery, jazz) interpret a target male as more hostile, consistent with the implicit stereotype of the hostile black man.[39] However, this finding is controversial because the character's race is not specified; it has been suggested that the procedure primed the race-unspecified concept of hostility and did not necessarily reflect stereotypes.[37] By getting to know people who differ from you on a real, personal level, you can begin to build new associations about the groups those individuals represent and break down existing implicit associations.[60]

Malleability of implicit stereotypes

Implicit stereotypes can, at least temporarily, be reduced or increased. Most methods have been found to reduce implicit bias only temporarily, and their effects depend largely on context.[61] Some evidence suggests that implicit bias can be reduced long-term, but doing so may require education and consistent effort. Implicit bias training techniques designed to counteract implicit bias include stereotype replacement, counter-stereotypic imaging, individuation, perspective taking, and increasing opportunities for contact.[62]

Stereotype replacement is when you replace a stereotypical response with a non-stereotypical response. Counter-stereotypic imagining is when you imagine others in a positive light and replace stereotypes with positive examples. Individuation is when you focus on specific details of a certain member of a group to avoid over-generalizing. Perspective taking is when you take the perspective of a member of a marginalized group. Increasing opportunities for contact is when you actively seek out opportunities to engage in interactions with members of marginalized groups.[62]

Self and social motives

The activation of implicit stereotypes may be decreased when the individual is motivated to promote a positive self-image, either to oneself or to others in a social setting. There are two parts to this: internal and external motivation. Internal motivation is when an individual wants to be careful of what they say, and external motivation is when an individual has a desire to respond in a politically correct way.[63]

Positive feedback from a black person decreases stereotypic sentence completion, while negative feedback from a black person increases it.[64] Subjects also show weaker race stereotypes when they feel others disagree with the stereotypes.[65] Motivated self-regulation does not immediately reduce implicit bias; rather, it raises awareness of discrepancies when biases stand in the way of personal beliefs.[63]

Promote counterstereotypes

Implicit stereotypes can be reduced by exposure to counterstereotypes. Reading biographies of females in leadership roles (such as Meg Whitman, the CEO of eBay) increases females’ associations between female names and words like leader, determined, and ambitious in a gender stereotype IAT.[66] Attending a women's college (where students are presumably more often exposed to women in leadership positions) reduces associations between leadership and males after one year of schooling.[66] Merely imagining a strong woman reduces implicit association between females and weakness, and imagining storybook princesses increases the implicit association between females and weakness.[16]

Focus of attention

Diverting a participant's focus of attention can reduce implicit stereotypes. Generally, female primes facilitate reaction times to stereotypical female traits when participants are instructed to indicate whether the prime is animate. When participants are instead instructed to indicate whether a white dot is present on the prime, their focus of attention is diverted from the prime's feminine features. This weakens the strength of the prime and thus the strength of gender stereotypes.[67]

Configuration of stimulus cues

Whether stereotypes are activated depends on the context. When presented with an image of a Chinese woman, Chinese stereotypes were stronger after seeing her use chopsticks, and female stereotypes were stronger after seeing her put on makeup.[68]

Characteristics of individual category members

Stereotype activation may be stronger for some category members than for others. People express weaker gender stereotypes with unfamiliar than familiar names.[69] Judgments and gut reactions that go along with implicit biases are based on how familiar something is.[70]

Criticism

Some social psychology research has indicated that individuating information (giving someone any information about an individual group member other than category information) may eliminate the effects of stereotype bias.[71]

Meta-analyses

Researchers from the University of Wisconsin-Madison, Harvard, and the University of Virginia examined 426 studies over 20 years involving 72,063 participants that used the IAT and other similar tests. They concluded two things:

  1. The correlation between implicit bias and discriminatory behavior appears weaker than previously thought.
  2. There is little evidence that changes in implicit bias correlate with changes in a person's behavior.[72]

In a 2013 meta-analysis, Oswald and colleagues declared that, despite its frequent misrepresentation as a proxy for the unconscious, "the IAT provides little insight into who will discriminate against whom, and provides no more insight than explicit measures of bias."[73]

News outlets

Heather Mac Donald, writing in The Wall Street Journal, noted that:

Few academic ideas have been as eagerly absorbed into public discourse lately as “implicit bias.” Embraced by Barack Obama, Hillary Clinton and most of the press, implicit bias has spawned a multimillion-dollar consulting industry, along with a movement to remove the concept of individual agency from the law. Yet its scientific basis is crumbling.

Mac Donald suggests there is still a political and economic drive to use the implicit bias paradigm as a political lever and to profit off entities that want to avoid litigation.[74]

Psychometric concerns

Edouard Machery has argued that “the use of [indirect measures like the implicit association test] is deeply problematic” because the tests do not exhibit the psychometric properties we would expect from measures of "attitudes".[75] However, many have already acknowledged that these indirect tests "assess behavior" rather than attitudes.[76] This is an example of how the debate about implicit bias can involve "talking past one another" based on "different expectations of indirect measures", views of what implicit bias is, assumptions about which evidence is relevant, thresholds for scientific significance, psychometric standards, and even norms of science communication.[77] So evaluating debates about tests of implicit bias requires one to pay careful attention to debaters' background assumptions and whether (or how well) debaters justify those assumptions.

Statement by original authors

Greenwald and Banaji had previously asserted in their book Blindspot (2013):

Given the relatively small proportion of people who are overtly prejudiced and how clearly it is established that automatic race preference predicts discrimination, it is reasonable to conclude not only that implicit bias is a cause of Black disadvantage but also that it plausibly plays a greater role than does explicit bias.[74]

The evidence presented by their peer researchers led them to concede in correspondence that:

  1. The IAT does not predict biased behavior (in laboratory settings).
  2. It is "problematic to use [the IAT] to classify persons as likely to engage in discrimination".

However, they also stated, "Regardless of inclusion policy, both meta-analyses estimated aggregate correlational effect sizes that were large enough to explain discriminatory impacts that are societally significant either because they can affect many people simultaneously or because they can repeatedly affect single persons."[78]

Summary

Implicit bias is thought to be the product of positive or negative mental associations about persons, things, or groups that are formed and activated pre-consciously or subconsciously. In 1995, researchers Banaji and Greenwald noted that someone's social learning experiences, such as observing parents, friends, or others, could create this type of association and, therefore, trigger this type of bias. Many studies have found that culture can stimulate biases as well, both negatively and positively, regardless of someone's personal experience with other cultures.[79] Implicit bias knows no age restriction; it can be held by anyone, and implicit biases have been found in children as young as six years old.[79] Even though implicit bias may be more difficult to detect than explicit bias, it can be measured through a number of mechanisms, such as sequential priming, response competition, EDA, EMG, fMRI, ERP, and the IAT.[80] Once a person becomes aware of their own bias, they can take action to change it, if they wish.[81]

The existence of implicitly biased behavior is supported by several articles in the psychological literature. Adults, and even children, may hold implicit stereotypes of social categories, including categories to which they themselves belong. Without intention, or even awareness, implicit stereotypes affect human behavior and judgments. This has wide-ranging implications for society, from discrimination and personal career choices to understanding others in everyday social interactions.[1][26][23][39][59]

References

  1. 1.0 1.1 1.2 1.3 1.4 Greenwald, Anthony G.; Banaji, Mahzarin R. (1995). "Implicit social cognition: Attitudes, self-esteem, and stereotypes". Psychological Review 102 (1): 4–27. doi:10.1037/0033-295x.102.1.4. PMID 7878162. 
  2. Byrd, Nick (February 2021). "What we can (and can't) infer about implicit bias from debiasing experiments". Synthese 198 (2): 1427–1455. doi:10.1007/s11229-019-02128-6. 
  3. 3.0 3.1 Hahn, Adam; Judd, Charles M.; Hirsh, Holen K.; Blair, Irene V. (June 2014). "Awareness of implicit attitudes". Journal of Experimental Psychology: General 143 (3): 1369–1392. doi:10.1037/a0035028. PMID 24294868. 
  4. Gawronski, Bertram (June 10, 2019). "Six Lessons for a Cogent Science of Implicit Bias and Its Criticism". Perspectives on Psychological Science 14 (4): 574–595. doi:10.1177/1745691619826015. PMID 31181174. 
  5. Jost, John T.; Rudman, Laurie A.; Blair, Irene V.; Carney, Dana R.; Dasgupta, Nilanjana; Glaser, Jack; Hardin, Curtis D. (2009). "The existence of implicit bias is beyond reasonable doubt: A refutation of ideological and methodological objections and executive summary of ten studies that no manager should ignore". Research in Organizational Behavior 29: 39–69. doi:10.1016/j.riob.2009.10.001. 
  6. Gaertner, Sam; Brown, Rupert (April 15, 2008). Blackwell Handbook of Social Psychology: Intergroup Processes. John Wiley & Sons. ISBN 9780470692707. https://books.google.com/books?id=LNZHf3K4xzMC. Retrieved August 11, 2013. 
  7. Del Pinal, Guillermo; Spaulding, Shannon (February 2018). "Conceptual centrality and implicit bias". Mind & Language 33 (1): 95–111. doi:10.1111/mila.12166. https://philpapers.org/rec/GUICCA-4. 
  8. 8.0 8.1 Toribio, Josefa (March 1, 2018). "Implicit Bias: from social structure to representational format". Theoria 33 (1): 41. doi:10.1387/theoria.17751. 
  9. Crano, W.D., & Prislin, R. (2008). Attitudes and attitude change. New York: CRC Press.
  10. Gaertner, Samuel L.; Dovidio, John F. (1999). "Reducing Prejudice: Combating Intergroup Biases". Current Directions in Psychological Science 8 (4): 101–105. doi:10.1111/1467-8721.00024. 
  11. Hahn, Adam; Gawronski, Bertram (May 2019). "Facing one's implicit biases: From awareness to acknowledgment.". Journal of Personality and Social Psychology 116 (5): 769–794. doi:10.1037/pspi0000155. PMID 30359070. http://psyarxiv.com/6cqdk/download. 
  12. Maina, Ivy W.; Belton, Tanisha D.; Ginzberg, Sara; Singh, Ajit; Johnson, Tiffani J. (February 2018). "A decade of studying implicit racial/ethnic bias in healthcare providers using the implicit association test". Social Science & Medicine 199: 219–229. doi:10.1016/j.socscimed.2017.05.009. PMID 28532892. 
  13. Harris, Matthew; Macinko, James; Jimenez, Geronimo; Mullachery, Pricila (December 2017). "Measuring the bias against low-income country research: an Implicit Association Test". Globalization and Health 13 (1): 80. doi:10.1186/s12992-017-0304-y. PMID 29110668. 
  14. Shelton, J. N.; Richeson, J. A.; Salvatore, J.; Trawalter, S. (May 1, 2005). "Ironic Effects of Racial Bias During Interracial Interactions". Psychological Science 16 (5): 397–402. doi:10.1111/j.0956-7976.2005.01547.x. PMID 15869700. 
  15. Nosek, Brian A.; Banaji, Mahzarin R. (December 2001). "The Go/No-Go Association Task". Social Cognition 19 (6): 625–666. doi:10.1521/soco.19.6.625.20886. http://psyarxiv.com/4ed36//download. 
  16. 16.0 16.1 Blair, Irene V.; Ma, Jennifer E.; Lenton, Alison P. (2001). "Imagining stereotypes away: The moderation of implicit stereotypes through mental imagery.". Journal of Personality and Social Psychology 81 (5): 828–841. doi:10.1037/0022-3514.81.5.828. PMID 11708560. 
  17. 17.0 17.1 Meyer, David E.; Schvaneveldt, Roger W. (1971). "Facilitation in recognizing pairs of words: Evidence of a dependence between retrieval operations". Journal of Experimental Psychology 90 (2): 227–234. doi:10.1037/h0031564. PMID 5134329. 
  18. Banaji, Mahzarin R.; Hardin, Curtis D. (May 1996). "Automatic Stereotyping". Psychological Science 7 (3): 136–141. doi:10.1111/j.1467-9280.1996.tb00346.x. 
  19. 19.0 19.1 Sekaquaptewa, Denise; Espinoza, Penelope; Thompson, Mischa; Vargas, Patrick; von Hippel, William (January 2003). "Stereotypic explanatory bias: Implicit stereotyping as a predictor of discrimination". Journal of Experimental Social Psychology 39 (1): 75–82. doi:10.1016/S0022-1031(02)00512-7. 
  20. Geisen, Emily; Sha, Mandy; Roper, Farren (2024). Bias testing in market research: A framework to enable inclusive research design (published January 3, 2024). ISBN 979-8862902785. https://www.amazon.com/dp/B0CRGN8RNT. 
  21. Hamberg, Katarina (May 2008). "Gender bias in medicine". Women's Health 4 (3): 237–243. doi:10.2217/17455057.4.3.237. PMID 19072473. 
  22. 22.0 22.1 Rudman, L. A.; Greenwald, A. G.; McGhee, D. E. (2001). "Implicit self-concept and evaluative implicit gender stereotypes: Self and ingroup share desirable traits". Personality and Social Psychology Bulletin 27 (9): 1164–1178. doi:10.1177/0146167201279009. 
  23. 23.0 23.1 Banaji, M. R.; Greenwald, A. G. (1995). "Implicit gender stereotyping in judgments of fame". Journal of Personality and Social Psychology 68 (2): 181–198. doi:10.1037/0022-3514.68.2.181. PMID 7877095. 
  24. White, M. J.; White, G. B. (2006). "Implicit and explicit occupational gender stereotypes". Sex Roles 55 (3–4): 259–266. doi:10.1007/s11199-006-9078-z. 
  25. 25.0 25.1 25.2 Nosek, B. A.; Banaji, M. R.; Greenwald, A. G. (2002). "Math = male, me = female, therefore math ≠ me". Journal of Personality and Social Psychology 83 (1): 44–59. doi:10.1037/0022-3514.83.1.44. PMID 12088131. 
  26. 26.0 26.1 26.2 Steffens, M. C.; Jelenec, P.; Noack, P. (2010). "On the leaky math pipeline: Comparing implicit math-gender stereotypes and math withdrawal in female and male children and adolescents". Journal of Educational Psychology 102 (4): 947–963. doi:10.1037/a0019920. 
  27. Kiefer, Amy K.; Sekaquaptewa, Denise (January 2007). "Implicit Stereotypes, Gender Identification, and Math-Related Outcomes: A Prospective Study of Female College Students". Psychological Science 18 (1): 13–18. doi:10.1111/j.1467-9280.2007.01841.x. PMID 17362371. 
  28. Kiefer, A. K.; Sekaquaptewa, D. (2007). "Implicit stereotypes and women's math performance: How implicit gender-math stereotypes influence women's susceptibility to stereotype threat". Journal of Experimental Social Psychology 43 (5): 825–832. doi:10.1016/j.jesp.2006.08.004. 
  29. "Women, Minorities, and Persons with Disabilities in Science and Engineering". https://www.nsf.gov/statistics/2017/nsf17310/digest/about-this-report/. 
  30. 30.0 30.1 30.2 Nosek, B. A.; Smyth, F. L.; Sriram, N. N.; Lindner, N. M.; Devos, T.; Ayala, A.; Greenwald, A. G. (2009). "National differences in gender–science stereotypes predict national sex differences in science and math achievement". Proceedings of the National Academy of Sciences of the United States of America 106 (26): 10593–10597. doi:10.1073/pnas.0809921106. PMID 19549876. Bibcode2009PNAS..10610593N. 
  31. 31.0 31.1 Jackson, Sarah M.; Hillard, Amy L.; Schneider, Tamera R. (September 2014). "Using implicit bias training to improve attitudes toward women in STEM". Social Psychology of Education 17 (3): 419–438. doi:10.1007/s11218-014-9259-5. 
  32. Yee, Doris K.; Eccles, Jacquelynne S. (September 1988). "Parent perceptions and attributions for children's math achievement". Sex Roles 19 (5–6): 317–333. doi:10.1007/bf00289840. 
  33. Milkman, Katherine L.; Akinola, Modupe; Chugh, Dolly (2015). "What happens before? A field experiment exploring how pay and representation differentially shape bias on the pathway into organizations". Journal of Applied Psychology 100 (6): 1678–1712. doi:10.1037/apl0000022. PMID 25867167. https://repository.upenn.edu/cgi/viewcontent.cgi?article=1389&context=fnce_papers. 
  34. Moss-Racusin, C. A.; Dovidio, J. F.; Brescoll, V. L.; Graham, M. J.; Handelsman, J. (October 9, 2012). "Science faculty's subtle gender biases favor male students". Proceedings of the National Academy of Sciences 109 (41): 16474–16479. doi:10.1073/pnas.1211286109. PMID 22988126. Bibcode2012PNAS..10916474M. 
  35. Handelsman, Jo; Ward, Wanda (December 12, 2016). "Increasing Diversity in the STEM Workforce by Reducing the Impact of Bias". https://obamawhitehouse.archives.gov/blog/2016/12/12/increasing-diversity-stem-workforce-reducing-impact-bias. 
  36. Noles, Erica (May 1, 2014). What's age got to do with it? Examining how the age of stimulus faces affects children's implicit racial bias. UNLV Theses, Dissertations, Professional Papers, and Capstones (Thesis). University of Nevada, Las Vegas. doi:10.34917/5836145. ProQuest 1566943023.
  37. 37.0 37.1 37.2 37.3 Wittenbrink, B.; Judd, C. M.; Park, B. (1997). "Evidence for racial prejudice at the implicit level and its relationship with questionnaire measures". Journal of Personality and Social Psychology 72 (2): 262–274. doi:10.1037/0022-3514.72.2.262. PMID 9107001. 
  38. Gaertner, S. L.; McLaughlin, J. P. (1983). "Racial stereotypes: Associations and ascriptions of positive and negative characteristics". Social Psychology Quarterly 46 (1): 23–30. doi:10.2307/3033657. 
  39. 39.0 39.1 39.2 Devine, P. G. (1989). "Stereotypes and prejudice: Their automatic and controlled components". Journal of Personality and Social Psychology 56: 5–18. doi:10.1037/0022-3514.56.1.5. 
  40. Staats, Cheryl; Patton, Charles (2013). State of the Science: Implicit Bias Review. Kirwan Institute for the Study of Race and Ethnicity at the Ohio State University. pp. 3, 37–39. https://kirwaninstitute.osu.edu/sites/default/files/2019-06//SOTS-Implicit_Bias.pdf. 
  41. Hohman, Zachary P.; Gaffney, Amber M.; Hogg, Michael A. (September 2017). "Who Am I If I Am Not like My Group? Self-Uncertainty and Feeling Peripheral in a Group". Journal of Experimental Social Psychology 72: 125–132. doi:10.1016/j.jesp.2017.05.002. 
  42. Aronson, E., Wilson, T. D., & Akert, R. (2010). Social psychology. 7th ed. Upper Saddle River: Prentice Hall.[page needed]
  43. Taylor, Donald M.; Doria, Janet R. (April 1981). "Self-Serving and Group-Serving Bias in Attribution". The Journal of Social Psychology 113 (2): 201–211. doi:10.1080/00224545.1981.9924371. 
  44. Dunham, Y.; Baron, A. S.; Banaji, M. R. (2008). "The Development of Implicit Intergroup Cognition". Trends in Cognitive Sciences 12 (7): 248–253. doi:10.1016/j.tics.2008.04.006. PMID 18555736. https://dash.harvard.edu/bitstream/handle/1/2902705/Banaji_ImplicitIntergroup.pdf?sequence=2. 
  45. 45.0 45.1 Greenwald, A. G.; Krieger, L. H. (2006). "Implicit Bias: Scientific Foundations". California Law Review 94 (4): 945–967. doi:10.2307/20439056. https://scholarship.law.berkeley.edu/californialawreview/vol94/iss4/1. 
  46. Reskin, B (2000). "The Proximate Causes of Employment Discrimination". Contemporary Sociology 29 (2): 319–328. doi:10.2307/2654387. 
  47. De Dreu, Carsten K.W. (2012). "Oxytocin modulates cooperation within and competition between groups: An integrative review and research agenda". Hormones and Behavior 61 (3): 419–428. doi:10.1016/j.yhbeh.2011.12.009. PMID 22227278. 
  48. Stangor, Charles; Leary, Scott P. (2006). "Intergroup beliefs: Investigations from the social side". Advances in Experimental Social Psychology 38: 243–281. doi:10.1016/S0065-2601(06)38005-7. ISBN 9780120152384. 
  49. Brewer, Marilynn B. (2002). "The psychology of prejudice: Ingroup love or outgroup hate?". Journal of Social Issues 55 (3): 429–444. doi:10.1111/0022-4537.00126. 
  50. Reskin, Barbara (2005). "Unconsciousness Raising". Regional Review, 2005 14 (3): 32–37. http://www.bostonfed.org/economic/nerr/rr2005/q1/section3a.pdf. 
  51. Clark, Kenneth B.; Clark, Mamie P. (1950). "Emotional Factors in Racial Identification and Preference in Negro Children". The Journal of Negro Education 19 (3): 341–350. doi:10.2307/2966491. 
  52. Ma-Kellams, Christine; Spencer-Rodgers, Julie; Peng, Kaiping (January 2011). "I Am Against Us? Unpacking Cultural Differences in Ingroup Favoritism via Dialecticism". Personality and Social Psychology Bulletin 37 (1): 15–27. doi:10.1177/0146167210388193. PMID 21084525. 
  53. Gross, E. F.; Hardin, C. D. (2007). "Implicit and explicit stereotyping of adolescents". Social Justice Research 20 (2): 140–160. doi:10.1007/s11211-007-0037-9. 
  54. Chopik, William J.; Giasson, Hannah L. (August 1, 2017). "Age Differences in Explicit and Implicit Age Attitudes Across the Life Span". The Gerontologist 57 (suppl_2): S169–S177. doi:10.1093/geront/gnx058. PMID 28854609. 
  55. Agerström, J.; Rooth, D. (2011). "The role of automatic obesity stereotypes in real hiring discrimination". Journal of Applied Psychology 96 (4): 790–805. doi:10.1037/a0021594. PMID 21280934. 
  56. Schwartz, M. B.; Vartanian, L. R.; Nosek, B. A.; Brownell, K. D. (2006). "The Influence of One's Own Body Weight on Implicit and Explicit Anti-fat Bias". Obesity 14 (3): 440–447. doi:10.1038/oby.2006.58. PMID 16648615. 
  57. Carlsson, Rickard; Björklund, Fredrick (2010). "Implicit stereotype content: mixed stereotypes can be measured with the implicit association test". Social Psychology 41 (4): 213–222. doi:10.1027/1864-9335/a000029. 
  58. "Understanding Implicit Bias" (in en-US). http://kirwaninstitute.osu.edu/research/understanding-implicit-bias/. 
  59. 59.0 59.1 Banaji, M. R.; Hardin, C.; Rothman, A. J. (1993). "Implicit stereotyping in person judgment". Journal of Personality and Social Psychology 65 (2): 272–281. doi:10.1037/0022-3514.65.2.272. 
  60. Cheryl Staats (Winter 2015–2016). "Understanding Implicit Bias" (in en). American Educator: pp. 29–43. https://www.aft.org/sites/default/files/ae_winter2015staats.pdf. 
  61. Calanchini, Jimmy; Lai, Calvin K.; Klauer, Karl Christoph (August 27, 2020). "Reducing implicit racial preferences: III. A process-level examination of changes in implicit preferences". Journal of Personality and Social Psychology 121 (4): 796–818. doi:10.1037/pspi0000339. PMID 32852973. http://psyarxiv.com/vuqa2/download. 
  62. 62.0 62.1 Devine, Patricia G.; Forscher, Patrick S.; Austin, Anthony J.; Cox, William T.L. (November 2012). "Long-term reduction in implicit race bias: A prejudice habit-breaking intervention". Journal of Experimental Social Psychology 48 (6): 1267–1278. doi:10.1016/j.jesp.2012.06.003. PMID 23524616. 
  63. 63.0 63.1 Burns, Mason D.; Monteith, Margo J.; Parker, Laura R. (November 2017). "Training away bias: The differential effects of counterstereotype training and self-regulation on stereotype activation and application". Journal of Experimental Social Psychology 73: 97–110. doi:10.1016/j.jesp.2017.06.003. 
  64. Sinclair, L.; Kunda, Z. (1999). "Reactions to a Black professional: Motivated inhibition and activation of conflicting stereotypes". Journal of Personality and Social Psychology 77 (5): 885–904. doi:10.1037/0022-3514.77.5.885. PMID 10573871. 
  65. Stangor, C.; Sechrist, G. B.; Jost, J. T. (2001). "Changing racial beliefs by providing consensus information". Personality and Social Psychology Bulletin 27 (4): 486–496. doi:10.1177/0146167201274009. 
  66. 66.0 66.1 Dasgupta, N.; Asgari, S. (2004). "Seeing is believing: Exposure to counterstereotypic women leaders and its effect on the malleability of automatic gender stereotyping". Journal of Experimental Social Psychology 40 (5): 642–658. doi:10.1016/j.jesp.2004.02.003. 
  67. Macrae, C.; Bodenhausen, G. V.; Milne, A. B.; Thorn, T. J.; Castelli, L. (1997). "On the activation of social stereotypes: The moderating role of processing objectives". Journal of Experimental Social Psychology 33 (5): 471–489. doi:10.1006/jesp.1997.1328. 
  68. Macrae, C.; Bodenhausen, G. V.; Milne, A. B. (1995). "The dissection of selection in person perception: Inhibitory processes in social stereotyping". Journal of Personality and Social Psychology 69 (3): 397–407. doi:10.1037/0022-3514.69.3.397. PMID 7562387. 
  69. Macrae, C.Neil; Mitchell, Jason P.; Pendry, Louise F. (March 2002). "What's in a Forename? Cue Familiarity and Stereotypical Thinking". Journal of Experimental Social Psychology 38 (2): 186–193. doi:10.1006/jesp.2001.1496. 
  70. Vierkant, Tillmann; Hardt, Rosa (April 2015). "Explicit Reasons, Implicit Stereotypes and the Effortful Control of the Mind". Ethical Theory and Moral Practice 18 (2): 251–265. doi:10.1007/s10677-015-9573-9. https://www.pure.ed.ac.uk/ws/files/189177238/VierkantHardtETMP2015ExplicitReasons.pdf. 
  71. Rubinstein, Rachel; Jussim, Lee (March 1, 2018). "Reliance on individuating information and stereotypes in implicit and explicit person perception". Journal of Experimental Social Psychology 75: 54–70. doi:10.1016/j.jesp.2017.11.009. https://psyarxiv.com/8dg25/. 
  72. Forscher, Patrick S.; Lai, Calvin K.; Axt, Jordan R.; Ebersole, Charles R.; Herman, Michelle; Devine, Patricia G.; Nosek, Brian A. (2019). "A Meta-Analysis of Procedures to Change Implicit Measures". Journal of Personality and Social Psychology 117 (3): 522–559. doi:10.1037/pspa0000160. PMID 31192631. 
  73. Oswald, Frederick; Mitchell, Gregory; Blanton, Hart; Jaccard, James; Tetlock, Philip (June 17, 2013). "Predicting Ethnic and Racial Discrimination: A Meta-Analysis of IAT Criterion Studies". Journal of Personality and Social Psychology 105 (2): 171–192. doi:10.1037/a0032734. PMID 23773046. 
  74. 74.0 74.1 Donald, Heather Mac (October 9, 2017). "The False 'Science' of Implicit Bias". The Wall Street Journal. https://www.wsj.com/articles/the-false-science-of-implicit-bias-1507590908. 
  75. Machery, Edouard (2021). "Anomalies in implicit attitudes research". WIREs Cognitive Science 13 (1): e1569. doi:10.1002/wcs.1569. PMID 34130361. https://doi.org/10.1002/wcs.1569. 
  76. Brownstein, Michael; Madva, Alex; Gawronski, Bertram (2019). "What do implicit measures measure?". WIREs Cognitive Science 10 (5): e1501. doi:10.1002/wcs.1501. PMID 31034161. https://doi.org/10.1002/wcs.1501. 
  77. Byrd, Nick; Thompson, Morgan (2022). "Testing for implicit bias: Values, psychometrics, and science communication". WIREs Cognitive Science 13 (5): e1612. doi:10.1002/wcs.1612. PMID 35671040. https://doi.org/10.1002/wcs.1612. 
  78. Greenwald, Anthony G.; Banaji, Mahzarin R.; Nosek, Brian A. (2015). "Statistically small effects of the Implicit Association Test can have societally large effects". Journal of Personality and Social Psychology 108 (4): 553–561. doi:10.1037/pspa0000016. PMID 25402677. http://psyarxiv.com/as2ez/download. 
  79. 79.0 79.1 Fazio; Jackson; Dunton; Williams. "Implicit Bias". http://www.ncsc.org/~/media/Files/PDF/Topics/Gender%20and%20Racial%20Fairness/Implicit%20Bias%20FAQs%20rev.ashx. 
  80. Gawronski, Bertram; Morrison, Mike; Phills, Curtis E.; Galdi, Silvia (March 2017). "Temporal Stability of Implicit and Explicit Measures: A Longitudinal Analysis". Personality and Social Psychology Bulletin 43 (3): 300–312. doi:10.1177/0146167216684131. PMID 28903689. 
  81. Gawronski, Bertram; Ledgerwood, Alison; Eastwick, Paul W. (October 2020). "Implicit Bias and Antidiscrimination Policy". Policy Insights from the Behavioral and Brain Sciences 7 (2): 99–106. doi:10.1177/2372732220939128. 
