Unit 1: Who Owns Your Face?
  • 🎭Unit Overview: Who Owns Your Face?
  • Module 0 -> Overview
    • Tips for Launching SPAR
  • Module 1: What claims do opponents and proponents of FRT make?
    • Module 1 -> Overview
      • Module 1 -> Lesson 1
      • Module 1 -> Lesson 2
      • Module 1 -> Lesson 3
      • Module 1 -> Lesson 4
      • Module 1 -> Lesson 5
  • Module 2: Who has the right to use FRT and for what purposes?
    • Module 2 -> Overview
      • Module 2 -> Lesson 1
      • Module 2 -> Lesson 2
      • Module 2 -> Lesson 3
      • Module 2 -> Lesson 4
      • Module 2 -> Lesson 5
      • Module 2 -> Lesson 6
  • Module 3: How do computers 'see'? How does FRT work?
    • Module 3 -> Overview
      • Module 3 -> Lesson 1
      • Module 3 -> Lesson 2
      • Module 3 -> Lesson 3
      • Module 3 -> Lesson 4
  • Module 4: To what extent does bias influence reliability & impact of FRT?
    • Module 4 -> Overview
      • Module 4 -> Lesson 1
      • Module 4 -> Lesson 2
      • Module 4 -> Lesson 3
      • Module 4 -> Lesson 4
      • Module 4 -> Lesson 5
  • 🤓Module 5: What role should government play in the public and private use of FRT?
    • Module 5 -> Overview
      • 🤓Module 5 -> Lesson 1
      • 🤓Module 5 -> Lesson 2
      • 🤓Module 5 -> Lesson 3
      • 🤓Module 5 -> Lesson 4
  • 🤓Module 6: Choose Your Own Adventure
    • Overview, Recommendations & Resources
      • 🎉EXTRA: Creative Resistance
  • End of Unit Project
    • Project Overview & Resources

Module 4 -> Overview



Essential Question

To what extent does bias influence reliability and impact of facial recognition technology?

Module Overview

In this module, students continue deepening their understanding of FRT, with a particular focus on how the technology is being used in NYC and the related risks, particularly those arising from various forms of bias (e.g., automation bias, confirmation bias, and algorithmic bias). Students closely read and evaluate a rich, complex text set and continue to develop claims, identify relevant evidence, and engage in deeper analysis. Students closely read the NYPD FRT Impact and Use Policy and examine the role of federal and local oversight, and how it might be leveraged to maximize the benefits and minimize the risks associated with FRT when used for policing. Students engage in another SPAR debate at the end of this module.

Anchor Text(s) for this Module

  • Wrongfully Arrested Because of Flawed Facial Recognition Technology, published April 11, 2021

  • Surveillance City: NYPD Can Use More than 15,000 Cameras to Track People Using Facial Recognition in Manhattan, Bronx, & Brooklyn [available in English, Spanish, French and Arabic], June 3, 2021, Amnesty International

  • New York City Police Department Surveillance Technology by Ángel Díaz, October 4, 2019, Brennan Center for Justice

Supporting Text(s)/ Resources for this Module

  • (recommended)

  • What are the risks and ethics of FRT? video by Risk Bites, July 25, 2020, from youtube.com

  • SPAR Debate Materials

NYS Next Generation ELA Standards

  • L4c: Consult general and specialized reference materials (e.g., dictionaries, glossaries, thesauruses) to find the pronunciation of a word or determine or clarify its precise meaning, its part-of-speech, or its etymology.

  • L6: Acquire and accurately use general academic and content-specific words and phrases, sufficient for reading, writing, speaking, and listening; demonstrate independence in applying vocabulary knowledge when considering a word or phrase important to comprehension or expression.

  • RH9: Compare and contrast treatments of the same topic in several primary and secondary sources.

  • RST1: Cite specific evidence to support analysis of scientific and technical texts, charts, diagrams, etc. attending to the precise details of the source. Understand and follow a detailed set of directions.

  • R1: Cite strong and thorough textual evidence to support analysis of what the text says explicitly/ implicitly and make logical inferences; develop questions for deeper understanding and for further exploration.

NYS Computer Science & Digital Fluency Standards

  • 9-12.DL.1: Type proficiently on a keyboard.

  • 9-12.DL.2: Communicate and work collaboratively with others using digital tools to support individual learning and contribute to the learning of others.

  • 9-12.IC.1: Evaluate the impact of computing technologies on equity, access, and influence in a global society.

  • 9-12.IC.3: Debate issues of ethics related to real world computing technologies.

  • 9-12.IC.5: Describe ways that complex computer systems can be designed for inclusivity and to mitigate unintended consequences.

Vocabulary

  • algorithmic bias: systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others; example: facial recognition algorithms trained with a data set that does not reflect human diversity and therefore results in different reliability based on gender and race.

  • cognitive bias: a thought process caused by the tendency of the human brain to simplify information processing through a filter of personal preferences and opinions; examples: confirmation bias; automation bias; authority bias; gender bias; group attribution error

  • confirmation bias: the conscious or subconscious tendency for people to seek out information that confirms their pre-existing viewpoints, and to ignore information that goes against them (regardless of whether that information is true or false); example: only paying attention to news stories that confirm your opinion and ignoring or dismissing information that challenges your position.

  • automation bias: the tendency to trust information provided by a machine/computer and to ignore information that contradicts it; automation bias is one type of cognitive bias. example: trusting the output of FRT when it produces a “match” without verifying that match by other means.

  • authority bias: the tendency to trust or follow the influence of a leader or a person in a position of authority

  • false positive: when the face recognition system matches a person’s face to an image in a database, but that match is actually incorrect

  • false negative: when the face recognition system fails to match a person’s face to an image that is, in fact, contained in a database
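The false positive / false negative distinction can be made concrete with a small sketch. This is a hypothetical illustration, not code from any real FRT system: the similarity scores, threshold value, and function names below are invented for teaching purposes. Real systems compare a probe image to database images and declare a "match" when a similarity score crosses a threshold, which is where both kinds of error come from.

```python
# Hypothetical sketch of threshold-based face matching and its two error types.
# Scores, threshold, and data below are invented for illustration only.

MATCH_THRESHOLD = 0.80  # scores at or above this count as a "match"

def classify(similarity, same_person):
    """Label one probe-vs-database comparison as a true/false positive/negative."""
    predicted_match = similarity >= MATCH_THRESHOLD
    if predicted_match and not same_person:
        return "false positive"   # system declares a match with the wrong person
    if not predicted_match and same_person:
        return "false negative"   # system misses someone who IS in the database
    return "true positive" if predicted_match else "true negative"

# Each pair: (similarity score, whether the two images really show the same person)
comparisons = [(0.91, True), (0.85, False), (0.62, True), (0.40, False)]
for score, same in comparisons:
    print(score, classify(score, same))
# 0.91 true positive
# 0.85 false positive
# 0.62 false negative
# 0.40 true negative
```

Note how the choice of threshold trades one error for the other: lowering it produces more false positives, raising it produces more false negatives, which connects directly to the automation-bias example above (trusting a "match" without independent verification).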

  • Wrongfully Arrested Because of Flawed Facial Recognition Technology (video transcript)

  • Excerpt from NYPD Facial Recognition: Impact & Use Policy (full report)

  • Surveillance City: NYPD Can Use More than 15,000 Cameras to Track People Using Facial Recognition in Manhattan, Bronx, & Brooklyn (archived text)

  • New York City Police Department Surveillance Technology (archived text)

  • Atlas of Surveillance

  • What are the risks and ethics of FRT? (video transcript)

  • Evidence tracker

  • SPAR Debate Activity Guide