Implicit Bias


Providing high-quality education to ensure that everyone learns is an explicit process over which educators exert a great deal of control. Yet that process can be undermined by an unconscious one, called implicit bias, that works counter to educators’ intentions.

Implicit bias refers to the tendency to act on the basis of prejudice and stereotypes without consciously intending to do so (Brownstein, 2019). It affects everyone, even people who think of themselves as antiracist or committed to diversity in their communities and workplaces. Implicit bias can be challenging to confront, both because it operates unconsciously and because it can lead people to doubt that efforts to combat racism will ever bring about the desired results. But there are steps everyone can take to counteract their own implicit biases.

While people have long believed that implicit bias existed, only recently have researchers begun to generate large bodies of data about it. In 1998, a group of scientists created a tool called the Implicit Association Test (IAT), designed to measure implicit bias by asking users to associate certain types of people with certain roles or attributes (Greenwald & Banaji, 1995; Greenwald et al., 1998). Users are timed, forcing them to make choices without much time to think. Since this tool became available, dozens of research studies have used it to show how implicit bias works and the situations in which it is common.

One thing the IAT reveals is that people make positive associations that closely track a racial hierarchy, “preferring” white people and white faces to people of color and associating white people with positive or power-holding attributes. Even more troubling, research has shown that this effect appears even when the test participants are themselves people of color (Ashburn-Nardo et al., 2003; Dasgupta, 2004; Nosek et al., 2002).

What are some examples of implicit bias at work in education?

Imagine a superintendent or principal considering the resumes of two prospective teachers. The two applicants have similar qualifications, but the names at the tops of the resumes, [white name] and [black name], are very different. One study of prospective employers in situations like this showed that they preferred “white”-sounding names even when the candidates’ resumes were identical; another showed that science faculty rated male applicants more favorably than identical female applicants (Bertrand & Mullainathan, 2003; Moss-Racusin et al., 2012).

Next, imagine a teacher in charge of a classroom. This setting also offers numerous opportunities for implicit bias to operate: it can affect the way teachers grade assignments and even the general expectations teachers hold for their students. One study showed that name associations affected teachers’ evaluations of the student work they were grading (Harari & McDavid, 1973).

If implicit bias operates whether we choose it or not, even when we profess contradictory beliefs, what can we do about it? While we may not be able to eliminate implicit bias entirely, some studies have suggested helpful tools for intervening against it.

One technique, sometimes called “blinding,” involves trying to eliminate factors (like names) that allow implicit bias associations to operate. The superintendent in our prior example might do this by considering resumes and applications that have names and demographic information removed.

But blinding is not always possible, especially for teachers. Some research suggests that simply being aware that one’s reactions are biased can help (Monteith, 1993; Monteith et al., 2002). A few studies suggest that carrying clear goals or “moods” into a situation, such as an intention to avoid unfair associations or expectations, can help offset one’s implicit biases (Huntsinger et al., 2010; Mann & Kawakami, 2012; Moskowitz & Li, 2011). Another set of techniques involves making explicit “if-then” plans for how to respond when opportunities for implicit bias arise (Gollwitzer & Sheeran, 2006; Mendoza et al., 2010; Stewart & Payne, 2008; Webb et al., 2012).

References

Ashburn-Nardo, L., Knowles, M. L., & Monteith, M. J. (2003). Black Americans’ implicit racial associations and their implications for intergroup judgment. Social Cognition, 21(1), 61–87. https://doi.org/10.1521/soco.21.1.61.21192

Bertrand, M., & Mullainathan, S. (2003). Are Emily and Greg more employable than Lakisha and Jamal? A field experiment on labor market discrimination (NBER Working Paper No. 9873). National Bureau of Economic Research. https://ideas.repec.org/p/nbr/nberwo/9873.html

Brownstein, M. (2019). Implicit bias. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Fall 2019 ed.). https://plato.stanford.edu/archives/fall2019/entries/implicit-bias/

Dasgupta, N. (2004). Implicit ingroup favoritism, outgroup favoritism, and their behavioral manifestations. Social Justice Research, 17, 143–169. https://doi.org/10.1023/B:SORE.0000027407.70241.15

Gollwitzer, P. M., & Sheeran, P. (2006). Implementation intentions and goal achievement: A meta-analysis of effects and processes. Advances in Experimental Social Psychology, 38, 69–119. https://doi.org/10.1016/S0065-2601(06)38002-1

Greenwald, A. G., & Banaji, M. R. (1995). Implicit social cognition: Attitudes, self-esteem, and stereotypes. Psychological Review, 102(1), 4–27. https://doi.org/10.1037/0033-295x.102.1.4

Greenwald, A. G., McGhee, D. E., & Schwartz, J. L. K. (1998). Measuring individual differences in implicit cognition: The Implicit Association Test. Journal of Personality and Social Psychology, 74(6), 1464–1480. https://doi.org/10.1037/0022-3514.74.6.1464

Harari, H., & McDavid, J. W. (1973). Name stereotypes and teachers’ expectations. Journal of Educational Psychology, 65(2), 222–225. https://doi.org/10.1037/h0034978

Huntsinger, J. R., Sinclair, S., Dunn, E., & Clore, G. L. (2010). Affective regulation of stereotype activation: It’s the (accessible) thought that counts. Personality & Social Psychology Bulletin, 36(4), 564–577. https://doi.org/10.1177/0146167210363404

Mann, N. H., & Kawakami, K. (2012). The long, steep path to equality: Progressing on egalitarian goals. Journal of Experimental Psychology: General, 141(1), 187–197. https://doi.org/10.1037/a0025602

Mendoza, S. A., Gollwitzer, P. M., & Amodio, D. M. (2010). Reducing the expression of implicit stereotypes: Reflexive control through implementation intentions. Personality and Social Psychology Bulletin, 36(4), 512–523. https://doi.org/10.1177/0146167210362789

Monteith, M. J. (1993). Self-regulation of prejudiced responses: Implications for progress in prejudice-reduction efforts. Journal of Personality and Social Psychology, 65(3), 469–485. https://doi.org/10.1037/0022-3514.65.3.469

Monteith, M. J., Ashburn-Nardo, L., Voils, C. I., & Czopp, A. M. (2002). Putting the brakes on prejudice: On the development and operation of cues for control. Journal of Personality and Social Psychology, 83(5), 1029–1050. https://doi.org/10.1037/0022-3514.83.5.1029

Moskowitz, G., & Li, P. (2011). Egalitarian goals trigger stereotype inhibition: A proactive form of stereotype control. Journal of Experimental Social Psychology, 47, 103–116. https://doi.org/10.1016/j.jesp.2010.08.014

Moss-Racusin, C. A., Dovidio, J. F., Brescoll, V. L., Graham, M. J., & Handelsman, J. (2012). Science faculty’s subtle gender biases favor male students. Proceedings of the National Academy of Sciences, 109(41), 16474–16479. https://doi.org/10.1073/pnas.1211286109

Nosek, B. A., Banaji, M. R., & Greenwald, A. G. (2002). Harvesting implicit group attitudes and beliefs from a demonstration web site. Group Dynamics: Theory, Research, and Practice, 6(1), 101–115. https://doi.org/10.1037/1089-2699.6.1.101

Stewart, B. D., & Payne, B. K. (2008). Bringing automatic stereotyping under control: Implementation intentions as efficient means of thought control. Personality & Social Psychology Bulletin, 34(10), 1332–1345. https://doi.org/10.1177/0146167208321269

Webb, T. L., Sheeran, P., & Pepper, J. (2012). Gaining control over responses to implicit attitude tests: Implementation intentions engender fast responses on attitude-incongruent trials. British Journal of Social Psychology, 51(1), 13–32. https://doi.org/10.1348/014466610X532192