Historically, Persistently, or Systemically Marginalized (HPSM) Groups
In addition to the terms defined in the “Power and Privilege Framework” on the previous page, the concepts of bias and barriers are integral to identifying and articulating our individual, interpersonal, and organizational/institutional relationships to power, privilege, oppression, and marginalization, including what individuals and leaders can do to support EDI within a large system like UBC (see the figure below).
Figure: Levels of Scale. Nested circles depicting the individual, interpersonal, and organizational/institutional levels. Adapted from Royal Dutch Shell plc, “Diversity and Inclusion in Shell” [company brochure], 2011, as reproduced in Jonsen, K., & Özbilgin, M. (2013). Models of global diversity management. In B. M. Ferdman & B. R. Deane (Eds.), Diversity at work: The practice of inclusion (p. 377). John Wiley & Sons.
These nested circles illustrate the levels of scale involved in embedding more inclusive practices within the research ecosystem at UBC. For instance, each member of the research team is responsible for learning about themselves, including the biases and assumptions they hold, the types of power and privilege they bring to the research project, and how they can be more curious when learning about others. Similarly, all members of the research team will need to examine their interpersonal skills to help build a more inclusive research team culture: identifying and challenging microaggressions and barriers within the research setting, and creating more opportunities to engage research team members regardless of their status within the team or university. Lastly, as a team lead, research/lab manager, and/or principal investigator, you hold the most power and status within the university context and have the ability to transform inequitable structures, challenge institutional barriers, and mentor those with less power and status as they move through their career, program, or degree journeys. These different scales require different perspectives, actions, and engagement from all members of the research team, which is why we have included this graphic here. Each of the behaviours described above helps to frame our approach to bias and barriers in the next section and throughout the course.
Cognitive biases: What are they and why do they matter in research?
Everyone holds biases. Many of these are based on our own lived experience(s): where we grew up, what our families valued, how our friends and teachers treated us, and the ways media mirrored (or did not mirror) these shared experiences. Together, these form our ‘frame of reference’, which can include “beliefs, schemas, preferences, values, culture and other ways in which we bias our understanding and judgment” (Neighbours, 2015, n.p.). Our approach and response to different situations is often based on the information that fits within our frame of reference. To cope, our brains have created shortcuts for managing the roughly 400 billion bits of data received every second, only 0.01% of which is processed consciously (Moghaddam, Chen, & Deshmukh, 2020). These shortcuts become the cognitive biases that help us move through the overwhelming amount of information we experience every moment of every day.
To illustrate the overarching impact that biases can have, the table below provides examples of how cognitive biases can affect the selection of graduate students, organized into four domains: 1) Too much information; 2) Not enough meaning; 3) Need to act fast; 4) What to remember. The table also outlines the corresponding impact/outcome of relying on a bias from each domain (see Benson, 2016, for a detailed illustration).
Too much information
Information overload is draining, so we aggressively filter; noise becomes signal.
Example: You need to review 100 graduate student applications to identify who will work on your research team. Each application includes a 6-page CV and a 2-page cover letter, and you know some of the applicants and have worked with others. How do you sift through all the information provided to make a meaningful and inclusive decision?
Outcome: We don’t (and can’t) see everything. Some of the information we filter out is actually useful and important.

Not enough meaning
Lack of meaning is confusing, so we fill in the gaps; signal becomes a story.
Example: You have posted two graduate student openings on your research team and circulated the call for applications widely across the university, but all you receive from applicants is a 1-page cover letter and a GPA. How do you differentiate between applications to make a meaningful and inclusive decision?
Outcome: Our search for meaning can conjure illusions. We sometimes imagine details filled in by our assumptions, and construct meaning and stories that aren’t really there.

Need to act fast
We need to act fast lest we lose our chance, so we jump to conclusions; stories become decisions.
Example: You’ve just been awarded a grant, but with tight turnaround times to implement the proposed research. The funder has requested a list of team members by the end of the day, leaving you 5 hours to decide who to include. How do you make an inclusive and meaningful decision in a short amount of time?
Outcome: Quick decisions can be seriously flawed. Some of the quick reactions and decisions we jump to are unfair, self-serving, and counterproductive.

What to remember
We try to remember the important bits; decisions inform our mental models of the world.
Example: In a rush to review both long (2-page cover letter and 6-page CV) and short (1-page cover letter and GPA) applications, and with only an hour before a deadline, you can only remember the first 10 and last 3 applications. How do you make an inclusive and meaningful decision before the deadline?
Outcome: Our memory reinforces errors. Some of the information we retain for later just makes the above systems more biased and more damaging to our thought processes.
Adapted from Benson, B. (2016). Cognitive bias cheat sheet. Better Humans.