{"id":22033,"date":"2021-04-05T14:45:05","date_gmt":"2021-04-05T21:45:05","guid":{"rendered":"https:\/\/www.pbs.org\/independentlens\/?post_type=blog&#038;p=22033"},"modified":"2023-09-19T13:58:13","modified_gmt":"2023-09-19T20:58:13","slug":"coded-bias-asks-are-our-faces-being-used-against-us","status":"publish","type":"blog","link":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/","title":{"rendered":"Coded Bias&#8221; Asks: Are Our Faces Being Used Against Us?"},"content":{"rendered":"<p><strong>By Christina Sturdivant Sani<\/strong><\/p>\n<hr \/>\n<p><span style=\"font-weight: 400;\"><a href=\"https:\/\/www.pbs.org\/independentlens\/documentaries\/coded-bias\/\"><em><strong>Coded Bias<\/strong><\/em><\/a>\u00a0begins with <\/span><span style=\"font-weight: 400;\">MIT researcher <strong>Joy Boulamwini\u2019<\/strong>s <\/span><span style=\"font-weight: 400;\">quest to find out why facial recognition technology <\/span><span style=\"font-weight: 400;\">inaccurately viewed her beautifully-hued brown face. After viewing the documentary<i> <\/i>with my friend India, a UX designer, and my husband Hamzat, the retail director at a Washington, D.C.-based boutique that centers around Black artisans, we were left <\/span><span style=\"font-weight: 400;\">with myriad questions about our relationships with technology and how race, class, and social structures influence our ties to tech. Our post-documentary conversation eventually led us to poll nearly two dozen friends to see how their experiences compared to ours (keep reading for the results of our unscientific yet informative survey).\u00a0<\/span><!--more--><\/p>\n<p><span style=\"font-weight: 400;\">As mid-30-somethings, we grew up in the time of dial-up internet and landline phones. If we missed a call back then, we\u2019d have to check the caller-ID or dial *69. 
I didn\u2019t get my first cellphone\u2014a silver flip Samsung\u2014until I was a senior in high school. My son, in comparison, received his first cell at 8 years old.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">But when the Pew Research Center began tracking Americans\u2019 internet usage in early 2000, about half of all adults were already online. By 2019, <\/span><span style=\"font-weight: 400;\">nine out of every 10<\/span><span style=\"font-weight: 400;\"> adults in the U.S. were using the internet, according to the center\u2019s data. Moreover, smartphones have become the primary way to access the internet at home for a growing number of Americans, according to Pew, which dubs roughly one-in-five American adults as \u201csmartphone-only\u201d internet users.\u00a0<\/span><\/p>\n<blockquote class=\"twitter-tweet\" data-width=\"500\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\">&quot;It turned out these <a href=\"https:\/\/twitter.com\/hashtag\/FacialRecognitionAlgorithms?src=hash&amp;ref_src=twsrc%5Etfw\">#FacialRecognitionAlgorithms<\/a> perform better on the male faces than the female faces. They perform significantly better on lighter faces than the darker faces&quot; &#8211; <a href=\"https:\/\/twitter.com\/jovialjoy?ref_src=twsrc%5Etfw\">@jovialjoy<\/a> <a href=\"https:\/\/t.co\/ye6tIre5Zs\">pic.twitter.com\/ye6tIre5Zs<\/a><\/p>\n<p>&mdash; Coded Bias Documentary (@CodedBias) <a href=\"https:\/\/twitter.com\/CodedBias\/status\/1352315523695054849?ref_src=twsrc%5Etfw\">January 21, 2021<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p><span style=\"font-weight: 400;\">For many Americans, the advancement of technology made our lives simpler and more convenient. Though we remember a time when encyclopedias were all the rage, we can\u2019t fathom a world in which a quick Google search wouldn&#8217;t render dozens of results to our most random of curiosities. 
<\/span><span style=\"font-weight: 400;\">As a journalist, I tune in to my favorite morning news podcast via my phone on a daily basis. Then I check my news aggregator app to see what I missed. I research stories from my phone, contact sources, and share my published articles from the same device.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Both India and Hamzat use facial recognition technology to open their iPhones. India also uses the feature to open bank apps. Meanwhile, Hamzat uses his likeness to pay for sneakers on apps like <\/span><span style=\"font-weight: 400;\">SNKRS and to text <\/span><span style=\"font-weight: 400;\">Memojies<\/span><span style=\"font-weight: 400;\"> of himself to our 12-year-old son.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">But as essential a tool as our phones may be, the misuse of their embedded technology can have dangerous outcomes. Before watching <\/span><i><span style=\"font-weight: 400;\">Coded Bias<\/span><\/i><span style=\"font-weight: 400;\">, we thought little of how <\/span><span style=\"font-weight: 400;\">facial recognition systems use algorithms to identify faces, and how those algorithms are created by humans\u2014typically white men\u2014who have cognitive biases.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">As we wait for legislation that would govern against bias in the algorithms that influence our daily lives, <\/span><span style=\"font-weight: 400;\">tech companies continue to sell<\/span><span style=\"font-weight: 400;\"> their facial recognition technology to law enforcement.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Given our newfound knowledge and intrigue, the three of us decided to conduct an unscientific survey to gauge how some of our friends and relatives engage with facial recognition technology. 
As this unregulated form of tech becomes increasingly embedded in the lives of people across the globe, we wanted to see how people close to us use the technology, how trusting they are of it, and what apprehensions they may harbor. After coming up with a few probing questions, I created the survey and we texted it out.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Within two days, we received over 20 responses. About 90% of our respondents were like us: between the ages of 25 and 44 and identified as Black or African American. Women dominated our sample, with more than 72% of respondents identifying as female.\u00a0<\/span><\/p>\n<p><b>The Findings<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Overall, our findings showed that most of our friends and relatives are familiar with facial recognition software and the majority of them don\u2019t view it as a threat. They varied more widely when determining the value they\u2019d put on their likeness if asked to use the technology for corporate branding. Here are the results in more detail:\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">When it came to respondents\u2019 daily use of facial recognition technology, half of them use it to unlock their phones (like me, 14% of those surveyed had phones that didn\u2019t offer that feature), and 60% use it to open apps and communicate via text and social media.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">More than 86% of folks said they <\/span><b>never used facial recognition technology in buildings or places like airports<\/b><span style=\"font-weight: 400;\">. 
This made us wonder how many of our friends even realize that the technology is <\/span><span style=\"font-weight: 400;\">reportedly<\/span><span style=\"font-weight: 400;\"> employed for commercial purposes such as tracking people who enter and leave apartment buildings, monitoring the attendance of employees at businesses, and seeing how people respond to ads in real time.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">One of the most reassuring responses came from the 81% of respondents who said they\u2019ve never been locked out of their devices because of facial recognition software. Hamzat found this quite hard to believe, as he\u2019s struggled through multiple attempts to unlock his phone: \u201cI figured it was too dark, I didn\u2019t have my glasses on, the angle wasn\u2019t right, my forehead was too shiny, I had a booger in my nose, I needed to drink more water. I didn\u2019t know! I just figured it wasn\u2019t my iPhone\u2019s fault but some other despicable set of circumstances,\u201d he said during our discussion.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Our next question, which we purposely made open-ended, surveyed <\/span><b>how much money respondents would charge a company for the use of their face or likeness<\/b><span style=\"font-weight: 400;\">. Many respondents chose fixed amounts, ranging from a couple hundred bucks all the way to $100 million. A couple of people said they would not participate for any amount of money, and one person didn\u2019t think their face would even be desired for replication. A small enterprising group considered fluctuating valuations depending on use and profit. Little do they know, tech companies are already selling their faces to the police.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Our final question offered a list of options to see <\/span><b>how our friends generally felt about facial recognition technology<\/b><span style=\"font-weight: 400;\">. 
Suggesting that societal pressure typically reigns supreme when it comes to adopting the latest technology, more than 40% of our friends admitted that they would eventually use facial recognition technology, while 27% had already succumbed to it.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Only 10% of our respondents worried it might not be inclusive of all skin tones, body types, and facial structures. A similar question posed to respondents of another Pew Research Center survey found that <\/span><span style=\"font-weight: 400;\">roughly three-quarters<\/span><span style=\"font-weight: 400;\"> of U.S. adults think facial recognition technologies are \u201cat least somewhat effective\u201d at accurately identifying individual people. Just over 60% of people thought the tools are effective at accurately assessing someone\u2019s gender or race.\u00a0<\/span><\/p>\n<p><em>Joy Buolamwini&#8217;s own study on facial recognition bias:<\/em><\/p>\n<iframe loading=\"lazy\" title=\"Gender Shades\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/TWWsW1w-BVo?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<p><strong>So what does this all mean?\u00a0<\/strong><\/p>\n<p><span style=\"font-weight: 400;\">Well, we\u2019ve always known that technology comes with trade-offs, but we\u2019re still discovering the reach that facial recognition has into different facets of life. We\u2019d like to think we have the power to ascribe a value to our own likeness, but without regulations, there\u2019s a good chance that our faces have already been captured, sold, or used against us.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Issues of racist and sexist bias are currently bound to the technology that we love for its convenience and entertainment. 
Fortunately, films like <\/span><i><span style=\"font-weight: 400;\">Coded Bias<\/span><\/i><span style=\"font-weight: 400;\"> and people like Joy are speaking truth to power and helping everyday folks claim their faces, spaces, and futures for the good of all mankind.<\/span><\/p>\n<hr \/>\n<p><span style=\"font-weight: 400;\">I\u2019ll leave you with this poem written by Hamzat, inspired by Joy Buolamwini and the great <\/span><span style=\"font-weight: 400;\">Gil Scott-Heron.\u00a0<\/span><\/p>\n<p style=\"padding-left: 80px;\"><strong>Artificial Revolution: Decoded<\/strong><\/p>\n<p style=\"padding-left: 80px;\"><em>The truth is you can\u2019t code your way out of racism.<\/em><br \/>\n<em>The revolution will not come tightly baked into an algorithm allowing others to see you plainly, human.<\/em><br \/>\n<em>The revolution will not come in an app that distinguishes your Black child from villain or valueless.<\/em><br \/>\n<em>You won\u2019t be able to pop into a cultural isolating soundscape with your AirPods because the revolution will not cancel noise, brother, just enhance it.<\/em><br \/>\n<em>There will be no App the Revolution, it won\u2019t be available on the Apple Store, Google Store, Amazon store or the stores of ignorance white supremacy has allowed for.<\/em><br \/>\n<em>The revolution will be raw, uncut, unable to be loaded down or downloaded.<\/em><br \/>\n<em>The revolution will be pure, uncorrupted, unencrypted, untethered unarmed and loaded.<\/em><br \/>\n<em>The revolution will require your face, your ears, your mouth, your body<\/em><br \/>\n<em>To be yours,<\/em><br \/>\n<em>To be put in use to uphold the sacred humanity of those that stand beside and behind you.<\/em><br \/>\n<em>Facial recognition will not be your ticket in, sister. 
No beautifully symmetrical placement of eyes, centering of nose or thinning of lips will guarantee entry.<\/em><br \/>\n<em>Your skin, sun-starved or clay-baked will not certify you.<\/em><br \/>\n<em>The hair on your pretty little head afro\u2019d, locked, braided or bantu\u2019d, mohawked, blond, red or purpled will not satisfy the requirements of revolutionary understanding.<\/em><br \/>\n<em>The revolution will not be coded only for some to decipher in a language unrecognizable to the soul, the mind or the body of the people.<\/em><br \/>\n<em>It will not be streamed, flixed, wired or disconnected from the realities of the least of these.<\/em><br \/>\n<em>The revolution must simply be.<\/em><br \/>\n<em>The revolution is you.<\/em><br \/>\n<em>The revolution has always been me. <strong>\u2014Hamzat Sani<\/strong><\/em><\/p>\n<hr \/>\n<p>BONUS: On the subject of poetry, here&#8217;s a striking one by Joy Buolamwini herself.<\/p>\n<iframe loading=\"lazy\" title=\"AI, Ain&#039;t I A Woman? - Joy Buolamwini\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/QxuyfWoVV98?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<hr \/>\n<p>A prolific freelance journalist, <strong>Christina Sturdivant Sani<\/strong> has written for nearly two dozen publications including <em>The Washington Post<\/em>, Zagat, CityLab, <em>Here Magazine,<\/em> and <em>Washington City Paper<\/em>. 
She has covered a multitude of topics including art and culture, food, urbanism, crime, politics, education, commercial real estate, race relations, mental health and wellness, business, and technology.\u00a0Find her at <a href=\"https:\/\/www.seesturdi.com\/\" target=\"_blank\" rel=\"noopener noreferrer\">seesturdi.com<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>By Christina Sturdivant Sani Coded Bias\u00a0begins with MIT researcher Joy Buolamwini\u2019s quest to find out why facial recognition technology inaccurately viewed her beautifully-hued brown face. After viewing the documentary with my friend India, a UX designer, and my husband Hamzat, the retail director at a Washington, D.C.-based boutique that centers on Black artisans, we were [&hellip;]<\/p>\n","protected":false},"author":13,"featured_media":22038,"comment_status":"open","ping_status":"closed","template":"","meta":{"_acf_changed":false,"footnotes":""},"categories":[1357,1877],"tags":[],"topic":[1259,1239,1264,1983,2125],"class_list":["post-22033","blog","type-blog","status-publish","has-post-thumbnail","hentry","category-beyond-the-films","category-lifestyle","topic-civil-liberties","topic-identity","topic-race-ethnicity","topic-science","topic-technology"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.2 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>&quot;Coded Bias&quot; Asks: Are Our Faces Being Used Against Us? 
- Independent Lens<\/title>\n<meta name=\"description\" content=\"Inspired by Joy Boulamwini and the film &quot;Coded Bias,&quot; a writer sets out to test facial recognition bias with a survey, with some surprising results.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Coded Bias Asks: Are Our Faces Being Used Against Us? | Blog | Independent Lens | PBS\" \/>\n<meta property=\"og:description\" content=\"After viewing \u201cCoded Bias,\u201d writer Christina Sturdivant Sani, asks: are our face being used against us? Within two days, she received over 20 responses. Read more.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/\" \/>\n<meta property=\"og:site_name\" content=\"Independent Lens\" \/>\n<meta property=\"article:modified_time\" content=\"2023-09-19T20:58:13+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.pbs.org\/independentlens\/wp-content\/uploads\/2021\/03\/coded-bias-looking-at-phone.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1920\" \/>\n\t<meta property=\"og:image:height\" content=\"1080\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:title\" content=\"Coded Bias Asks: Are Our Faces Being Used Against Us? | Blog | Independent Lens | PBS\" \/>\n<meta name=\"twitter:description\" content=\"After viewing \u201cCoded Bias,\u201d writer Christina Sturdivant Sani, asks: are our face being used against us? Within two days, she received over 20 responses. 
Read more.\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"7 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/\"},\"author\":{\"name\":\"Independent Lens\",\"@id\":\"https:\/\/www.pbs.org\/independentlens\/#\/schema\/person\/4cedb3eea460cdaac69638c5d476f7bf\"},\"headline\":\"Coded Bias&#8221; Asks: Are Our Faces Being Used Against Us?\",\"datePublished\":\"2021-04-05T21:45:05+00:00\",\"dateModified\":\"2023-09-19T20:58:13+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/\"},\"wordCount\":1624,\"commentCount\":0,\"image\":{\"@id\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.pbs.org\/independentlens\/wp-content\/uploads\/2021\/03\/coded-bias-looking-at-phone.jpg\",\"articleSection\":[\"Beyond the Films\",\"Lifestyle\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/\",\"url\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/\",\"name\":\"Coded Bias\\\" Asks: Are Our Faces Being Used Against Us? 
- Independent Lens\",\"isPartOf\":{\"@id\":\"https:\/\/www.pbs.org\/independentlens\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.pbs.org\/independentlens\/wp-content\/uploads\/2021\/03\/coded-bias-looking-at-phone.jpg\",\"datePublished\":\"2021-04-05T21:45:05+00:00\",\"dateModified\":\"2023-09-19T20:58:13+00:00\",\"description\":\"Inspired by Joy Boulamwini and the film \\\"Coded Bias,\\\" a writer sets out to test facial recognition bias with a survey, with some surprising results.\",\"breadcrumb\":{\"@id\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#primaryimage\",\"url\":\"https:\/\/www.pbs.org\/independentlens\/wp-content\/uploads\/2021\/03\/coded-bias-looking-at-phone.jpg\",\"contentUrl\":\"https:\/\/www.pbs.org\/independentlens\/wp-content\/uploads\/2021\/03\/coded-bias-looking-at-phone.jpg\",\"width\":1920,\"height\":1080,\"caption\":\"A still from Coded Bias, directed by Shalini Kantayya; photo credit Steve 
Acevedo\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.pbs.org\/independentlens\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Posts\",\"item\":\"https:\/\/www.pbs.org\/independentlens\/blog\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Coded Bias&#8221; Asks: Are Our Faces Being Used Against Us?\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.pbs.org\/independentlens\/#website\",\"url\":\"https:\/\/www.pbs.org\/independentlens\/\",\"name\":\"Independent Lens\",\"description\":\"Independent Documentary Films\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.pbs.org\/independentlens\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.pbs.org\/independentlens\/#\/schema\/person\/4cedb3eea460cdaac69638c5d476f7bf\",\"name\":\"Independent Lens\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/secure.gravatar.com\/avatar\/2b5c0f7775847014c2f5553ec273875f0a9d53d7393cbafef77867f9e0883487?s=96&r=g\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/2b5c0f7775847014c2f5553ec273875f0a9d53d7393cbafef77867f9e0883487?s=96&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/2b5c0f7775847014c2f5553ec273875f0a9d53d7393cbafef77867f9e0883487?s=96&r=g\",\"caption\":\"Independent Lens\"},\"url\":\"https:\/\/www.pbs.org\/independentlens\/author\/indielens\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Coded Bias\" Asks: Are Our Faces Being Used Against Us? 
- Independent Lens","description":"Inspired by Joy Boulamwini and the film \"Coded Bias,\" a writer sets out to test facial recognition bias with a survey, with some surprising results.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/","og_locale":"en_US","og_type":"article","og_title":"Coded Bias Asks: Are Our Faces Being Used Against Us? | Blog | Independent Lens | PBS","og_description":"After viewing \u201cCoded Bias,\u201d writer Christina Sturdivant Sani, asks: are our face being used against us? Within two days, she received over 20 responses. Read more.","og_url":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/","og_site_name":"Independent Lens","article_modified_time":"2023-09-19T20:58:13+00:00","og_image":[{"width":1920,"height":1080,"url":"https:\/\/www.pbs.org\/independentlens\/wp-content\/uploads\/2021\/03\/coded-bias-looking-at-phone.jpg","type":"image\/jpeg"}],"twitter_card":"summary_large_image","twitter_title":"Coded Bias Asks: Are Our Faces Being Used Against Us? | Blog | Independent Lens | PBS","twitter_description":"After viewing \u201cCoded Bias,\u201d writer Christina Sturdivant Sani, asks: are our face being used against us? Within two days, she received over 20 responses. Read more.","twitter_misc":{"Est. 
reading time":"7 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#article","isPartOf":{"@id":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/"},"author":{"name":"Independent Lens","@id":"https:\/\/www.pbs.org\/independentlens\/#\/schema\/person\/4cedb3eea460cdaac69638c5d476f7bf"},"headline":"Coded Bias&#8221; Asks: Are Our Faces Being Used Against Us?","datePublished":"2021-04-05T21:45:05+00:00","dateModified":"2023-09-19T20:58:13+00:00","mainEntityOfPage":{"@id":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/"},"wordCount":1624,"commentCount":0,"image":{"@id":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#primaryimage"},"thumbnailUrl":"https:\/\/www.pbs.org\/independentlens\/wp-content\/uploads\/2021\/03\/coded-bias-looking-at-phone.jpg","articleSection":["Beyond the Films","Lifestyle"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/","url":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/","name":"Coded Bias\" Asks: Are Our Faces Being Used Against Us? 
- Independent Lens","isPartOf":{"@id":"https:\/\/www.pbs.org\/independentlens\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#primaryimage"},"image":{"@id":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#primaryimage"},"thumbnailUrl":"https:\/\/www.pbs.org\/independentlens\/wp-content\/uploads\/2021\/03\/coded-bias-looking-at-phone.jpg","datePublished":"2021-04-05T21:45:05+00:00","dateModified":"2023-09-19T20:58:13+00:00","description":"Inspired by Joy Boulamwini and the film \"Coded Bias,\" a writer sets out to test facial recognition bias with a survey, with some surprising results.","breadcrumb":{"@id":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#primaryimage","url":"https:\/\/www.pbs.org\/independentlens\/wp-content\/uploads\/2021\/03\/coded-bias-looking-at-phone.jpg","contentUrl":"https:\/\/www.pbs.org\/independentlens\/wp-content\/uploads\/2021\/03\/coded-bias-looking-at-phone.jpg","width":1920,"height":1080,"caption":"A still from Coded Bias, directed by Shalini Kantayya; photo credit Steve Acevedo"},{"@type":"BreadcrumbList","@id":"https:\/\/www.pbs.org\/independentlens\/blog\/coded-bias-asks-are-our-faces-being-used-against-us\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.pbs.org\/independentlens\/"},{"@type":"ListItem","position":2,"name":"Posts","item":"https:\/\/www.pbs.org\/independentlens\/blog\/"},{"@type":"ListItem","position":3,"name":"Coded Bias&#8221; Asks: 
Are Our Faces Being Used Against Us?"}]},{"@type":"WebSite","@id":"https:\/\/www.pbs.org\/independentlens\/#website","url":"https:\/\/www.pbs.org\/independentlens\/","name":"Independent Lens","description":"Independent Documentary Films","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.pbs.org\/independentlens\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/www.pbs.org\/independentlens\/#\/schema\/person\/4cedb3eea460cdaac69638c5d476f7bf","name":"Independent Lens","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/2b5c0f7775847014c2f5553ec273875f0a9d53d7393cbafef77867f9e0883487?s=96&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/2b5c0f7775847014c2f5553ec273875f0a9d53d7393cbafef77867f9e0883487?s=96&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/2b5c0f7775847014c2f5553ec273875f0a9d53d7393cbafef77867f9e0883487?s=96&r=g","caption":"Independent 
Lens"},"url":"https:\/\/www.pbs.org\/independentlens\/author\/indielens\/"}]}},"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/blog\/22033","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/blog"}],"about":[{"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/types\/blog"}],"author":[{"embeddable":true,"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/users\/13"}],"replies":[{"embeddable":true,"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/comments?post=22033"}],"version-history":[{"count":5,"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/blog\/22033\/revisions"}],"predecessor-version":[{"id":27828,"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/blog\/22033\/revisions\/27828"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/media\/22038"}],"wp:attachment":[{"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/media?parent=22033"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/categories?post=22033"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/tags?post=22033"},{"taxonomy":"topic","embeddable":true,"href":"https:\/\/www.pbs.org\/independentlens\/wp-json\/wp\/v2\/topic?post=22033"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}