Module 4, Lesson 2

Close reading, active listening, collaborative discussion

Confirm or Challenge

Police should be trusted to use facial recognition technology for the safety and betterment of all citizens.

Anchor Text(s) for this Lesson

Supporting Text(s)/Resources for this Lesson

Lesson Overview

In this lesson, students critically analyze the placement of surveillance cameras in three NYC boroughs: Brooklyn, the Bronx, and Manhattan. Students also get a visual demonstration of how much, and how far, one particular type of NYPD camera can see. Drawing on what they learned about algorithmic bias, automation bias, and the case study from Lesson 1, students will consider what guidelines and policies could be put in place to prevent false arrests. Students will then evaluate the NYPD guidelines for using facial recognition and consider whether those guidelines provide sufficient guardrails.

Nota Bene

The activity in today's lesson includes two short videos, close reading of heat maps, and an activity guide that asks students to analyze information presented in these materials. Students are prompted to draw on learning from Lesson 1, so if you have students who missed the prior lesson(s), you may want to plan adjustments for them in advance.

Objectives

Students will be able to...

  • Closely read a variety of maps to extract specific information.

  • Evaluate and discuss the NYPD guidelines for usage of FRT.

  • BONUS: Develop suggested policies or practices to mitigate the impact of biases when FRT is used in policing.

Suggested Duration

45 minutes (adjust according to your students' needs)

NYS Next Generation ELA Standards

  • RH7: Integrate and evaluate visual and technical information (e.g., in research data, charts, graphs, photographs, videos or maps) with other information in print and digital texts.

  • RH9: Compare and contrast treatments of the same topic in several primary and secondary sources.

  • W1: Write arguments to support claims that analyze substantive topics or texts, using valid reasoning and relevant and sufficient evidence.

  • W1c: Use precise language and content-specific vocabulary to express the appropriate complexity of the topic.

NYS Computer Science & Digital Fluency Standards

  • 9-12.IC.1 Evaluate the impact of computing technologies on equity, access, and influence in a global society.

  • 9-12.IC.3 Debate issues of ethics related to real world computing technologies.

  • 9-12.IC.5 Describe ways that complex computer systems can be designed for inclusivity and to mitigate unintended consequences.

Vocabulary

  • false negative: when the face recognition system fails to match a person’s face to an image that is, in fact, contained in a database

  • false positive: when the face recognition system does match a person’s face to an image in a database, but that match is actually incorrect

  • algorithmic bias: systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others; example: facial recognition algorithms trained on a data set that does not reflect human diversity, and that therefore perform with different reliability across gender and race.

  • cognitive bias: a thought process caused by the tendency of the human brain to simplify information processing through a filter of personal preferences and opinions; examples: confirmation bias; automation bias; authority bias; gender bias; group attribution error

  • automation bias: the tendency to trust information provided by a machine or computer and to ignore information that contradicts it; automation bias is one type of cognitive bias; example: trusting the output of FRT when it produces a “match” without verifying that match by other means.

  • confirmation bias: the conscious or subconscious tendency for people to seek out information that confirms their pre-existing viewpoints, and to ignore information that goes against them (regardless of whether that information is true or false); example: only paying attention to news stories that confirm your opinion and ignoring or dismissing information that challenges your position.

  • authority bias: the tendency to trust and follow the influence of a leader or person in a position of authority
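If a concrete illustration of "false positive" and "false negative" would help your own preparation (or students in a CS-aligned class), the sketch below is a minimal, hypothetical model of face matching in Python. The "embeddings," names, and threshold values are all invented for illustration; real systems compare high-dimensional face embeddings, but the trade-off is the same: in this toy setup, one threshold setting produces a false positive and the other a false negative.

```python
import math

# Toy "face embeddings": tiny vectors standing in for the high-dimensional
# features a real face recognition system extracts from a photo.
# All names and numbers here are invented for illustration.
database = {
    "Person A": [0.9, 0.1, 0.3],
    "Person B": [0.2, 0.8, 0.5],
}

def cosine_similarity(u, v):
    """Similarity between two vectors: 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def identify(probe, threshold):
    """Return the best database match at or above the threshold, else None."""
    best_name, best_score = None, 0.0
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, round(best_score, 3)) if best_score >= threshold else (None, round(best_score, 3))

# A probe photo of Person A, distorted by lighting and angle:
probe_of_a = [0.7, 0.3, 0.4]
# A probe photo of a stranger NOT in the database who merely resembles Person B:
probe_of_stranger = [0.3, 0.7, 0.6]

for threshold in (0.90, 0.99):
    print(f"threshold = {threshold}")
    # At 0.90: Person A is correctly matched, but the stranger also "matches"
    # Person B -- a false positive.
    # At 0.99: the stranger is correctly rejected, but Person A is missed -- a
    # false negative.
    print("  probe of Person A ->", identify(probe_of_a, threshold))
    print("  probe of stranger ->", identify(probe_of_stranger, threshold))
```

Lowering the threshold catches more true matches but invites false positives; raising it avoids false positives but produces false negatives. When those error rates differ across gender or race, the result is the algorithmic bias defined above.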

Hook

Present the above map and ask students to respond to the following prompts:

  • What do you notice? What do you wonder? What questions do you have?

  • According to this map, do all neighborhoods have an equal number of surveillance cameras?

  • How do you think the NYPD decides where to place surveillance cameras? Explain your reasoning.

Mini-Lesson

Project the map from the hook for a second time. Share the definition of a heat map and unpack how this definition relates to the map used in the warm-up.
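If it helps to demystify the term, the sketch below builds a toy heat map from scratch (hypothetical, invented camera coordinates; assumes Python with matplotlib installed): point locations are binned into a grid, and each cell's count becomes a color intensity, the same convention students will see in the Amnesty International map.

```python
import random
import matplotlib.pyplot as plt

# Hypothetical camera locations: random (x, y) points clustered in two
# "neighborhoods" to mimic uneven camera placement. Purely invented data.
random.seed(0)
cluster_one = [(random.gauss(2, 0.5), random.gauss(2, 0.5)) for _ in range(200)]
cluster_two = [(random.gauss(7, 1.5), random.gauss(6, 1.5)) for _ in range(100)]
xs, ys = zip(*(cluster_one + cluster_two))

# hist2d bins the points into a 20x20 grid; denser cells get hotter colors.
plt.hist2d(xs, ys, bins=20, cmap="hot")
plt.colorbar(label="cameras per grid cell")
plt.title("Toy heat map: density of (invented) camera locations")
plt.xlabel("x")
plt.ylabel("y")
plt.show()
```

Denser clusters of points show up as hotter colors, which is why a heat map makes uneven camera placement across neighborhoods immediately visible.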

Project the heat map from Amnesty International's Surveillance City article and model for students how to read it (link to map). Click between public and private cameras to change the output on the map. Explicitly teach students to read the map's key, which explains what falls under the "public" and "private" categories, shows why Staten Island and Queens do not yet have any data, and indicates what the different colors represent. Present an outline of the steps close readers take while reading a map. Give students an opportunity to jot their thoughts in response to the questions in the activity guide.

Screen the video that shows the power and range of the NYPD camera, then give students time to share their thoughts verbally and in writing (activity guide). Then screen the second video, being careful to emphasize that it features a different city: South Orange, NJ. Students watch and consider the question: Why have police in South Orange, NJ chosen not to use FRT? Give students time to discuss and write their thoughts after watching this short video.

Activity

Present the guidelines published by the NYPD in their report: NYPD Facial Recognition: Impact and Use Policy:

"Facial recognition technology must only be used for legitimate law enforcement purposes. Authorized uses of facial recognition technology are limited to the following:

  1. To identify an individual when there is a basis to believe that such individual has committed, is committing, or is about to commit a crime;

  2. To identify an individual when there is a basis to believe that such individual is a missing person, crime victim, or witness to criminal activity;

  3. To identify a deceased person;

  4. To identify a person who is incapacitated or otherwise unable to identify themselves;

  5. To identify an individual who is under arrest and does not possess valid identification, is not forthcoming with valid identification, or who appears to be using someone else’s identification, or a false identification; or

  6. To mitigate an imminent threat to health or public safety (e.g., to thwart an active terrorism scheme or plot)."

Activity guide directions: In a previous lesson, you learned about the ways in which algorithmic bias and automation bias can lead to misidentification and false arrests. Some police departments, the NYPD included, have developed their own guidelines for how they will use FRT in policing, in an effort to make the best use of the technology while also mitigating the potential risks.

Click here to read an excerpt from the NYPD Facial Recognition: Impact and Use Policy. As you read, keep in mind the following purpose for reading:

Purpose: How do these guidelines mitigate the potential risks of FRT while also generating benefits from using the technology?

  1. What guidelines, if any, help to prevent false arrests resulting from algorithmic and automation bias? Explain your reasoning.

  2. The NYPD policy clearly states that FRT can only be used for legitimate law enforcement purposes. What are your thoughts on these limitations? Would you add to or take away from this list?

  3. BONUS: Outline a training program that would help police officers make the best use of FRT, so they can gain the benefits of the technology while also reducing the risks.

Wrap Up

Elicit students' thoughts about the guidelines, along with any questions or ideas they would like to share.

Next class, we will take a closer look at the ways in which different surveillance technologies are being used in NYC. We will also learn how citizens can take action and petition the government to write laws that provide oversight aimed at preventing misuse and abuse of these technologies.

CHALLENGE: Make a prediction:

  • Do you think most people in your neighborhood are in favor of police using FRT technology to solve crimes? Explain your reasoning.
