Introduction to unconscious bias
What is unconscious bias?
We may not be aware of it, but we often place people into categories based on age, religion, race, gender and politics. This is unconscious bias - sometimes called implicit bias. We all have unconscious bias and it’s a natural and necessary survival mechanism that allows us to quickly assess situations, and decide promptly what action we should take, based on beliefs and previous experiences. It would be difficult to function if we approached every situation as if it were entirely new and had never been encountered before.
However, problems can arise when we make incorrect decisions based on flawed assumptions, beliefs and experiences. Being aware of our own biases and making changes in our work can help us minimise the risk of making poor decisions. There are three main types of unconscious bias:
- Affinity bias - this could cause us to recruit or promote people who look similar to us, or have similar language, names or culture. We might be more likely to view people like us (or who we like) positively, and more likely to notice negative traits in people unlike us, or who we don't like.
- Confirmation bias - we might unconsciously look for evidence that supports our pre-existing beliefs, and ignore evidence that contradicts them.
- Social comparison bias - we may be more critical of people who we are in competition with, or who we see as potentially better at things than we are. We might not be aware of this, and it might affect how fairly we treat others who we feel threatened by.
In the short video Understanding Unconscious Bias, The Royal Society explains how the human brain works, and how our natural propensity to make quick, unconscious decisions and judgements in the moment can lead us to make incorrect decisions and assumptions. It also covers in-groups and out-groups: our tendency to favour people who we feel are like us.
Other videos in the further information section below explore other perspectives on unconscious bias, including assumptions about gender, recruitment and employment, and how embracing diversity in our workforce helps us to provide better products and services.
You can identify your own biases using the Harvard Implicit Association Test - a series of short, fun and engaging online tests designed to help identify where we might have unconscious biases relating to age, religion, sexual orientation, disability, trans people and race. A consideration of the limits and interpretation of such tests, in particular in measuring implicit sexual attitudes in heterosexual, gay and bisexual individuals, can be found in this journal article by Anselmi et al (2013).
Reflecting on unconscious bias might help us to explore:
- Why, at York, 60% of biology undergraduates are female but only 30% of biology academics are, compared with 20% of biology academics nationally
- How we can support more women, and Black, Asian and Minority Ethnic (BAME*) staff with career progression, and address our gender and ethnicity pay gaps at York
- Why there is a need to diversify and decolonise curricula
- Why some voices are louder, and some people are more visible than others
The impact of unconscious bias at the University might mean that we don't recruit, promote, nurture or include people fairly and consistently. We might see fewer women in senior academic roles, even though they outnumber men at undergraduate level, and fewer Black, Asian and Minority Ethnic staff in senior leadership roles. A lack of role models may affect student recruitment and staff retention, with staff seeking career development elsewhere, and it might affect how staff feel about working at the University and with their colleagues.
For example, take a look at our Gender and Ethnicity pay gap report for 2022, which highlights pay and progression disparities between groups. A couple of illustrations are provided below, showing both that we have fewer female than male staff in senior positions, and the pay disparity between staff from minority ethnic backgrounds and white staff.
Table: University BAME pay gap

| Year | Mean (average) hourly rate pay gap | Median (middle) hourly rate pay gap |
|---|---|---|
| 2022 | 14.5% | 20.9% |
| 2021 | 14.8% | 18.6% |
The mean pay gap between BAME and white members of staff was 14.8% in 2021 and 14.5% in 2022. The median pay gap was 18.6% in 2021 and 20.9% in 2022.
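The figures above follow the standard pay gap calculation: the gap is the difference between the two groups' average (mean or median) hourly rates, expressed as a percentage of the reference group's rate. A minimal sketch of that calculation, using entirely hypothetical hourly rates for illustration only:

```python
from statistics import mean, median

def pay_gap(reference_rates, comparison_rates, average=mean):
    """Percentage by which the comparison group's average hourly
    rate falls below the reference group's average hourly rate."""
    ref = average(reference_rates)
    comp = average(comparison_rates)
    return round(100 * (ref - comp) / ref, 1)

# Hypothetical hourly rates, for illustration only
white_rates = [12.0, 15.0, 18.0, 25.0, 40.0]
bame_rates = [11.0, 13.0, 16.0, 20.0, 28.0]

print(pay_gap(white_rates, bame_rates))          # mean gap: 20.0
print(pay_gap(white_rates, bame_rates, median))  # median gap: 11.1
```

Note how the mean and median can differ substantially: a few very high salaries in one group pull the mean gap up (or down) without moving the median, which is why the report publishes both figures.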
Table: University ethnicity pay gap 2022

| Ethnicity | Mean (average) hourly rate pay gap | Median (middle) hourly rate pay gap |
|---|---|---|
| Black | 24.5% | 34.0% |
| Asian | 14.4% | 20.9% |
| Other | 11.5% | 7.1% |
We might be biased in our recruitment processes, and appoint people who we think are like us, who we see as less threatening to us, or who have views that coincide with our own.
We might unconsciously run social events that attract only part of our staff community, making those who attend feel welcome and included while leaving others feeling excluded and less welcome, and preventing some people from developing strong personal and professional relationships, networks and associations.
In meetings we might not listen to some groups of people as much as we listen to others, and some people might not be present at all. For example, younger people may be less likely to be asked for their opinion on issues, or brought into the conversation if they are not contributing as much, and our meetings might not reflect the diversity of staff across the University. We may be unaware this is happening.
People with particular characteristics might not be as visible as others, especially in senior leadership roles, which may unconsciously and negatively affect how staff and students view the suitability of people, including themselves, for particular roles (see Equality Challenge Unit, 2013, Unconscious Bias and Higher Education, for an exploration of the research on this).
* When using the acronym/term ‘BAME’ in our work we recognise that this does not fully capture the nuance and experiences of different ethnic identities, including the individual and cultural challenges faced by people in these communities.
For pay gap analysis, data is aggregated into a broad group including Black, Asian and minority ethnic identities to enable analysis of the difference in experience compared to the White majority. Aggregated groups do not fully reflect the complex and nuanced experiences of individuals included in these data groups.
For more information on language use please see the University's Using appropriate language when referring to race and ethnicity guidance.
Katie Oates: the University's approach
Katie Oates, Development Partner, explains the University’s dual approach to tackling unconscious bias - encouraging awareness of unconscious bias within individuals, and encouraging systemic change in our working practices.
Katie Oates, Development Partner for Talent, People and Organisational Development, Human Resources, University of York
This dual approach responds to critiques suggesting that individual awareness on its own is insufficient to lead to significant change. We support the sharing of learning, so that departments can learn from the excellent work that others have done, and think about ways to improve their own processes.
Staff are encouraged to ask themselves some critical questions about bias, including:
- Are you vigilant and curious about the objectivity of your decision making?
- Confirmation bias - are you influenced by information that supports existing beliefs? Do you make efforts to listen to arguments and opinions that differ from your own?
- How does your emotional state affect your decision making? Can your critical decision making be scheduled so that it isn’t overwhelmed by other priorities?
- In-groups and out-groups - are you making efforts to include all people in events and activities?
- What systems and processes in your departments could be changed to counteract bias? For example, how can you slow down decision making and give people more time to make judgements? How can you reduce mental fatigue?
- Challenging stereotypes - how can we influence what people see and hear in departments, and counteract stereotypes?
- How can we make decisions objectively, rather than on subjective feelings?
- What steps can we build into recruitment to ensure that sound decisions are made, such as providing sufficient time for discussion and reasoned judgments, and can we evidence that we judge people fairly against criteria agreed in advance?
Katie suggests we should also be proactive in recognising, and respectfully challenging, biases we observe in others, which we might be more likely to notice than our own.
Katie has a long standing background and interest in tackling bias in employment:
“I have long been fascinated by human behaviour, emotions and thought processes, and as a chartered organisational psychologist I have a keen interest in the application of behavioural science to improve the fairness and objectivity of decision making in the workplace. It’s so important that we are respected, included, valued and treated fairly at work. During my time at the College of Policing I worked to design and deliver national selection processes for police recruitment and promotions, and saw first-hand the benefits of both supporting assessors to spot and explore their own biases (it can be quite a shift for people to understand that we are not as rational as we like to believe!) and the impact of designing environments, systems and processes that help mitigate bias in decision making. At York, I enjoy supporting the development of staff, including models and skills for objective assessment during staff selection processes.”
Katie highly recommends Daniel Kahneman's 'Thinking, Fast and Slow' for insight into biases, errors of judgement and how to make better decisions. Written in an engaging way, it's available in the University library (see our information sources section for more information).
At the University we encourage staff to be aware of, limit and mitigate their unconscious biases.
In the past, we have provided training modules on unconscious bias awareness and unconscious bias in the recruitment and selection process.
However, one criticism of traditional unconscious bias training and awareness raising is that it concentrates on eliciting change in the individual, in the belief that being aware of our biases is enough to overcome them. Critics suggest that biases can also be deeply ingrained in our systems and processes, so that awareness alone is not sufficient to enable change. That's why we are taking a dual approach to tackling unconscious bias.
We do actively encourage individuals to be aware of their biases, but we also encourage systemic change by sharing good practice across the University, so that departments can learn from the great work that others have undertaken. Taking this dual approach, what are some of the steps we can take?
To build awareness of your biases, be vigilant and curious about the objectivity of your decision making.
When you’re gathering information, are you being swayed by content that supports your existing beliefs?
Are you aware of your emotional state and is it affecting your decision making?
Can you remind yourself to pay close attention to counter-arguments and opinions that differ from your own?
And if you are organising a networking event, is it catering more to people in your in-group than your out-group?
In terms of systemic changes, behavioural science research indicates that the best way to mitigate bias is to alter the environment in which we make decisions, so that we are less susceptible to bias.
Think about how you could alter systems and processes in your department to do this.
How can you slow down decision making and give people more time to make judgements?
How can you reduce mental fatigue?
How can you alter what people see and hear in the workplace so that inaccurate stereotypes are not reinforced but are countered?
How can we help ourselves and others to make decisions based on objective evidence rather than subjective feelings or inaccurate or incomplete memories of events?
For example, when planning meeting agendas, try to plan in sufficient time and place more complex issues at the start of the meeting.
If you’re recruiting, create a timetable and structure that gives assessors sufficient time to make reasoned logical judgements rather than snap decisions.
Try to schedule critical decision making when you are more energised and not overwhelmed by other decisions that you’ve been making earlier in the day.
Similarly, research shows that it is really important to manage your blood sugar to avoid mental fatigue - another reason not to work through lunch or skip breaks.
And if you are making any kind of assessment or judgement of another's performance, clearly set out your assessment criteria in advance and make thorough, accurate notes of your observations of their performance.
Evidence suggests that we are much better at spotting others' biases than our own. So help build a culture where we support each other by calling out biases when we see them, and are open to learning about our own.
There are many more steps that we can take to mitigate bias, so do share learning and insight from your department with others.
Katie Oates (University of York, HR Development Partner) provides a great overview of the ORCE (Observe, Record, Classify, Evaluate) method of assessment in recruitment and selection, and how we can overcome typical biases that we may encounter during the recruitment process.
Bodies beyond sex – what being intersex taught me about the world - Ronja Ziesel, TEDx Talks (15:03). A longer but captivating and insightful TEDx talk in which Ronja Ziesel, who is intersex, covers how people are socialised to make assumptions about others based on stereotypical expectations of sex, gender, and appearance. If you don't have time to watch the whole video, the section from 02:23 to 04:00 gives a humorous and enlightening illustration of this process.
What is unconscious bias? - The Employers Network for Equality and Inclusion (03:20). Introduces unconscious bias and the neuroscience behind it. This covers: how people place individuals into social categories based on visual cues such as gender, age and cultural background, and on other grounds such as social background and job role; how we become wired, or trained, by cultural stereotypes, such as only seeing women or men in certain types of job; and how affinity bias means managers may be more likely to favour individuals in recruitment and promotion where there is some affinity based on personal characteristics, and to behave negatively towards those where there is little affinity.
Unconscious Bias at Work - Making the Unconscious Conscious - Life at Google (03:58). Covers how we tend to design our services, systems and products to favour some individuals without being aware of it, because of our personal frame of reference and assumptions. This can affect the accessibility of resources, where we may assume others know the things we know, or can do the things we can do. Ultimately, being aware of our biases and taking a wider perspective can help us develop more effective and successful initiatives that better meet people's needs.
Unconscious Bias and Higher Education, Equality Challenge Unit (2013, now AdvanceHE) provides a broad literature review of a wide range of studies into unconscious bias, much of which is applicable to the Higher Education context, including recruitment biases (Steinpreis, 1999; Wood et al, 2009; Moss-Racusin et al, 2012), and how we might tackle bias by challenging stereotypes, providing broader representation in all that we do, and increasing contact between diverse groups of people.
Complete your learning
Now that you've learnt more about unconscious bias, you can go to the LMS and complete the quiz to demonstrate that you have completed this learning. You might also like to complete an action plan to record anything that you intend to take back to your department or service area for discussion.