{"id":129,"date":"2023-10-01T23:38:24","date_gmt":"2023-10-02T03:38:24","guid":{"rendered":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/?post_type=chapter&#038;p=129"},"modified":"2023-10-22T16:44:53","modified_gmt":"2023-10-22T20:44:53","slug":"lesson-1-exploring-algorithmic-biases","status":"publish","type":"chapter","link":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/chapter\/lesson-1-exploring-algorithmic-biases\/","title":{"raw":"Lesson 5: Exploring Algorithmic Biases","rendered":"Lesson 5: Exploring Algorithmic Biases"},"content":{"raw":"<h1><strong>Introduction<\/strong><\/h1>\r\n<ol>\r\n \t<li>We encounter many different types of biases on many different levels (e.g. personal, social). One of the most common cognitive biases that many of us experience is confirmation bias, i.e. seeking out information that supports our existing beliefs.<\/li>\r\n \t<li>However, when it comes to algorithmic biases, the effects can be far-reaching for society as a whole.<\/li>\r\n<\/ol>\r\n<ul>\r\n \t<li>Introduce students to the definition of algorithmic bias.<\/li>\r\n<\/ul>\r\n<h2><strong>Pre-activity<\/strong><\/h2>\r\n<ol>\r\n \t<li>Have you encountered any algorithmic biases? What were they?<\/li>\r\n \t<li>Complete the activity below. Did anything surprise you about the answers?<\/li>\r\n<\/ol>\r\n[h5p id=\"4\"]\r\n<h1>Lecture: Algorithmic biases<\/h1>\r\n<ol>\r\n \t<li>Explore the concept of data as a foundation for algorithms, and the transfer of biases from data into algorithmic systems, e.g. 
crime-predicting software targets Black and Latino neighbourhoods in the United States, based on biases in historical data (O\u2019Neil, 2016).<\/li>\r\n \t<li>Watch the TED Talk by Cathy O'Neil, entitled \"<a href=\"https:\/\/www.youtube.com\/watch?v=_2u_eHHzRto\">The era of blind faith in big data must end<\/a>\" [Transcript available].[embed]https:\/\/www.youtube.com\/watch?v=_2u_eHHzRto[\/embed]<\/li>\r\n<\/ol>\r\n<div class=\"textbox shaded\">\r\n\r\nAttribution: O\u2019Neil, C.<span style=\"font-size: 0.9em\">\u00a0(2017, September 7). <\/span><i style=\"font-size: 0.9em\">The era of blind faith in big data must end. <\/i><a style=\"font-size: 0.9em\" href=\"https:\/\/www.youtube.com\/watch?v=_2u_eHHzRto\">https:\/\/www.youtube.com\/watch?v=_2u_eHHzRto<\/a>\r\n\r\n<\/div>\r\n3. Watch this video by Joy Buolamwini, <a href=\"https:\/\/www.youtube.com\/watch?v=162VzSzzoPs\">The Coded Gaze: Unmasking Algorithmic Bias<\/a>.\r\n\r\n[embed]https:\/\/www.youtube.com\/watch?v=162VzSzzoPs[\/embed]\r\n<div class=\"textbox shaded\">Attribution: <span style=\"font-size: 0.9em\">Buolamwini, J. (2016, November 6). <\/span><i style=\"font-size: 0.9em\">The Coded Gaze: Unmasking Algorithmic Bias<\/i><span style=\"font-size: 0.9em\">. <\/span><a style=\"font-size: 0.9em\" href=\"https:\/\/www.youtube.com\/watch?v=162VzSzzoPs\">https:\/\/www.youtube.com\/watch?v=162VzSzzoPs<\/a><\/div>\r\n<h1><strong>Activity 1. Experience how the algorithm will judge your face.<\/strong><\/h1>\r\n<em>This activity requires the use of a camera; however, your personal data will not be collected.<\/em>\r\n<ol>\r\n \t<li>Go to <a href=\"https:\/\/www.hownormalami.eu\/\">https:\/\/www.hownormalami.eu\/<\/a> and follow the steps, allowing the algorithm to evaluate you based on its criteria. 
<img class=\"wp-image-132 alignright\" src=\"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-content\/uploads\/sites\/2102\/2023\/10\/Quote-Unit-2-lesson-1-1024x950.png\" alt=\"\" width=\"296\" height=\"275\" \/><\/li>\r\n \t<li>Explore the impact of this activity on your understanding of algorithmic biases.<\/li>\r\n<\/ol>\r\n<h2><strong>Discussion questions:<\/strong><\/h2>\r\n<ol>\r\n \t<li>Were you aware of the impact of algorithms on specific groups of people?<\/li>\r\n \t<li>Can you think of other real-world examples where algorithms have had unintended consequences?<\/li>\r\n \t<li>What did you learn about algorithmic biases by completing Activity 1?<\/li>\r\n \t<li>How will this impact your views of algorithmic systems in the future?<\/li>\r\n<\/ol>\r\n<h1>Self-assessment<\/h1>\r\n[h5p id=\"5\"]","rendered":"<h1><strong>Introduction<\/strong><\/h1>\n<ol>\n<li>We encounter many different types of biases on many different levels (e.g. personal, social). One of the most common cognitive biases that many of us experience is confirmation bias, i.e. seeking out information that supports our existing beliefs.<\/li>\n<li>However, when it comes to algorithmic biases, the effects can be far-reaching for society as a whole.<\/li>\n<\/ol>\n<ul>\n<li>Introduce students to the definition of algorithmic bias.<\/li>\n<\/ul>\n<h2><strong>Pre-activity<\/strong><\/h2>\n<ol>\n<li>Have you encountered any algorithmic biases? What were they?<\/li>\n<li>Complete the activity below. 
Did anything surprise you about the answers?<\/li>\n<\/ol>\n<div id=\"h5p-4\">\n<div class=\"h5p-iframe-wrapper\"><iframe id=\"h5p-iframe-4\" class=\"h5p-iframe\" data-content-id=\"4\" style=\"height:1px\" src=\"about:blank\" frameBorder=\"0\" scrolling=\"no\" title=\"Algorithmic biases: fiction or truth?\"><\/iframe><\/div>\n<\/div>\n<h1>Lecture: Algorithmic biases<\/h1>\n<ol>\n<li>Explore the concept of data as a foundation for algorithms, and the transfer of biases from data into algorithmic systems, e.g. crime-predicting software targets Black and Latino neighbourhoods in the United States, based on biases in historical data (O\u2019Neil, 2016).<\/li>\n<li>Watch the TED Talk by Cathy O&#8217;Neil, entitled &#8220;<a href=\"https:\/\/www.youtube.com\/watch?v=_2u_eHHzRto\">The era of blind faith in big data must end<\/a>&#8221; [Transcript available].<iframe loading=\"lazy\" id=\"oembed-1\" title=\"The era of blind faith in big data must end | Cathy O&#39;Neil\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/_2u_eHHzRto?feature=oembed&#38;rel=0\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/li>\n<\/ol>\n<div class=\"textbox shaded\">\n<p>Attribution: O\u2019Neil, C.<span style=\"font-size: 0.9em\">\u00a0(2017, September 7). <\/span><i style=\"font-size: 0.9em\">The era of blind faith in big data must end. <\/i><a style=\"font-size: 0.9em\" href=\"https:\/\/www.youtube.com\/watch?v=_2u_eHHzRto\">https:\/\/www.youtube.com\/watch?v=_2u_eHHzRto<\/a><\/p>\n<\/div>\n<p>3. 
Watch this video by Joy Buolamwini, <a href=\"https:\/\/www.youtube.com\/watch?v=162VzSzzoPs\">The Coded Gaze: Unmasking Algorithmic Bias<\/a>.<\/p>\n<p><iframe loading=\"lazy\" id=\"oembed-2\" title=\"The Coded Gaze: Unmasking Algorithmic Bias\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/162VzSzzoPs?feature=oembed&#38;rel=0\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<div class=\"textbox shaded\">Attribution: <span style=\"font-size: 0.9em\">Buolamwini, J. (2016, November 6). <\/span><i style=\"font-size: 0.9em\">The Coded Gaze: Unmasking Algorithmic Bias<\/i><span style=\"font-size: 0.9em\">. <\/span><a style=\"font-size: 0.9em\" href=\"https:\/\/www.youtube.com\/watch?v=162VzSzzoPs\">https:\/\/www.youtube.com\/watch?v=162VzSzzoPs<\/a><\/div>\n<h1><strong>Activity 1. Experience how the algorithm will judge your face.<\/strong><\/h1>\n<p><em>This activity requires the use of a camera; however, your personal data will not be collected.<\/em><\/p>\n<ol>\n<li>Go to <a href=\"https:\/\/www.hownormalami.eu\/\">https:\/\/www.hownormalami.eu\/<\/a> and follow the steps, allowing the algorithm to evaluate you based on its criteria. 
<img loading=\"lazy\" decoding=\"async\" class=\"wp-image-132 alignright\" src=\"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-content\/uploads\/sites\/2102\/2023\/10\/Quote-Unit-2-lesson-1-1024x950.png\" alt=\"\" width=\"296\" height=\"275\" srcset=\"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-content\/uploads\/sites\/2102\/2023\/10\/Quote-Unit-2-lesson-1-1024x950.png 1024w, https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-content\/uploads\/sites\/2102\/2023\/10\/Quote-Unit-2-lesson-1-300x278.png 300w, https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-content\/uploads\/sites\/2102\/2023\/10\/Quote-Unit-2-lesson-1-768x713.png 768w, https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-content\/uploads\/sites\/2102\/2023\/10\/Quote-Unit-2-lesson-1-65x60.png 65w, https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-content\/uploads\/sites\/2102\/2023\/10\/Quote-Unit-2-lesson-1-225x209.png 225w, https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-content\/uploads\/sites\/2102\/2023\/10\/Quote-Unit-2-lesson-1-350x325.png 350w, https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-content\/uploads\/sites\/2102\/2023\/10\/Quote-Unit-2-lesson-1.png 1111w\" sizes=\"auto, (max-width: 296px) 100vw, 296px\" \/><\/li>\n<li>Explore the impact of this activity on your understanding of algorithmic biases.<\/li>\n<\/ol>\n<h2><strong>Discussion questions:<\/strong><\/h2>\n<ol>\n<li>Were you aware of the impact of algorithms on specific groups of people?<\/li>\n<li>Can you think of other real-world examples where algorithms have had unintended consequences?<\/li>\n<li>What did you learn about algorithmic biases by completing Activity 1?<\/li>\n<li>How will this impact your views of algorithmic systems in the future?<\/li>\n<\/ol>\n<h1>Self-assessment<\/h1>\n<div id=\"h5p-5\">\n<div class=\"h5p-iframe-wrapper\"><iframe id=\"h5p-iframe-5\" class=\"h5p-iframe\" 
data-content-id=\"5\" style=\"height:1px\" src=\"about:blank\" frameBorder=\"0\" scrolling=\"no\" title=\"Unit 3: self-assessment\"><\/iframe><\/div>\n<\/div>\n","protected":false},"author":1763,"menu_order":1,"template":"","meta":{"pb_show_title":"on","pb_short_title":"","pb_subtitle":"","pb_authors":[],"pb_section_license":""},"chapter-type":[49],"contributor":[],"license":[],"class_list":["post-129","chapter","type-chapter","status-publish","hentry","chapter-type-numberless"],"part":64,"_links":{"self":[{"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/pressbooks\/v2\/chapters\/129","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/pressbooks\/v2\/chapters"}],"about":[{"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/wp\/v2\/types\/chapter"}],"author":[{"embeddable":true,"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/wp\/v2\/users\/1763"}],"version-history":[{"count":9,"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/pressbooks\/v2\/chapters\/129\/revisions"}],"predecessor-version":[{"id":172,"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/pressbooks\/v2\/chapters\/129\/revisions\/172"}],"part":[{"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/pressbooks\/v2\/parts\/64"}],"metadata":[{"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/pressbooks\/v2\/chapters\/129\/metadata\/"}],"wp:attachment":[{"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/wp\/v2\/media?parent=129"}],"wp:term":[{"taxonomy":"chapter-type","embeddable":true,"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/pressbooks\/v2\/chapter-type?post=129"},{"taxonomy":"contributor","embeddable":true,"href":"https:\/\/pressbooks.bccampus.ca\/algorithmic
awarenesstoolkit\/wp-json\/wp\/v2\/contributor?post=129"},{"taxonomy":"license","embeddable":true,"href":"https:\/\/pressbooks.bccampus.ca\/algorithmicawarenesstoolkit\/wp-json\/wp\/v2\/license?post=129"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}