Module 4, Lesson 5

SPAR! Rich discussion, active listening, learning through writing

Confirm or Challenge

Police should be trusted to use facial recognition technology for the safety and betterment of all citizens.

Anchor Text(s) for this Module

Supporting Text(s)/Resources for this Module

Lesson Overview

Today's the big day! Students have another opportunity to engage in SPAR debates, this time with much more schema about the SPAR protocol and the topic of FRT. It is recommended that you revisit the schedule presented in the slide decks and adjust the time allotted for writing counter-arguments and closing statements based on your students' needs. Be sure to allocate time for the SPAR reflection. In this SPAR, students will argue in response to the prompt above: "Police should be trusted to use facial recognition technology for the safety and betterment of all citizens."

Nota Bene

If you find it difficult to have 7 to 8 SPAR debates underway simultaneously in your classroom, consider running a rapid fishbowl of SPAR debates in which teams SPAR in succession while the other groups observe, listen, and take notes.

Objectives

Students will be able to...

  • present claims, evidence and reasoning within the SPAR format;

  • actively listen and take note of opponents' claims, evidence & reasoning;

  • draft and present counter-arguments to their opponents' arguments.

Suggested Duration

45 minutes (adjust according to your students' needs)

NYS Next Generation ELA Standards

  • SL1a: Come to discussions prepared, having read and researched material under study; draw on that preparation by referring to evidence to stimulate a thoughtful, well-reasoned exchange of ideas.

  • SL4: Present claims, findings, and supporting evidence clearly, concisely, and logically; organization, development, substance, and style are appropriate to task, purpose, and audience.

  • WHST1: Write arguments based on discipline-specific content.

  • W1c: Use precise language and content-specific vocabulary to express the appropriate complexity of the topic.

NYS Computer Science & Digital Fluency Standards

  • 9-12.IC.5 Describe ways that complex computer systems can be designed for inclusivity and to mitigate unintended consequences.

  • 9-12.IC.3 Debate issues of ethics related to real world computing technologies.

  • 9-12.IC.1 Evaluate the impact of computing technologies on equity, access, and influence in a global society.

Vocabulary

  • claim (noun): a statement that something is true although it has not been proved and other people may not agree with or believe it; a person’s stated position in a debate.

  • reasoning (noun): the process of thinking about things in a logical way; in debate, you use reasoning to connect the claim and evidence to explain how and why the evidence supports your claim.

  • evidence (noun): the facts, signs, or objects that make you believe that something is true.

  • guidelines (noun): rules or instructions that are given by an official organization telling you how to do something, especially something difficult.

  • oversight (noun): the state of being in charge of someone or something.

  • false negative: when the face recognition system fails to match a person’s face to an image that is, in fact, contained in a database.

  • false positive: when the face recognition system does match a person’s face to an image in a database, but that match is actually incorrect (see the sketch after this vocabulary list).

  • authority bias: the tendency to trust and follow the influence of a leader or a person in a position of authority.

  • confirmation bias: the conscious or subconscious tendency for people to seek out information that confirms their pre-existing viewpoints, and to ignore information that goes against them (regardless of whether that information is true or false); example: only paying attention to news stories that confirm your opinion and ignoring or dismissing information that challenges your position.

  • automation bias: the tendency to trust information provided by a machine or computer and to ignore information that contradicts it; automation bias is one type of cognitive bias. example: trusting the output of FRT when it produces a “match” without verifying that match by other means.

  • cognitive bias: a thought process caused by the tendency of the human brain to simplify information processing through a filter of personal preferences and opinions; examples: confirmation bias; automation bias; authority bias; gender bias; group attribution error

  • algorithmic bias: systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others; example: facial recognition algorithms trained with a data set that does not reflect human diversity and therefore results in different reliability based on gender and race.
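
If you want a concrete way to show students how the false negative and false positive terms above play out, here is a minimal sketch in Python. It assumes a hypothetical face recognition system that reports a similarity score between 0 and 1 and declares a "match" at or above a fixed cutoff; the threshold, scores, and cases below are invented for illustration only and are not drawn from any real FRT system.

    # Minimal, hypothetical sketch of how a threshold-based face recognition
    # decision can produce false positives and false negatives.
    # The threshold and all scores below are invented for illustration.

    MATCH_THRESHOLD = 0.80  # assumed cutoff: scores at or above this count as a "match"

    # Each tuple: (similarity score the system computed,
    #              whether the person really is in the database)
    test_cases = [
        (0.91, True),   # match reported and the person is enrolled      -> true positive
        (0.85, False),  # match reported but the person is NOT enrolled  -> false positive
        (0.62, True),   # person IS enrolled but the score falls short   -> false negative
        (0.40, False),  # no match reported and the person is not enrolled -> true negative
    ]

    for score, actually_in_database in test_cases:
        system_says_match = score >= MATCH_THRESHOLD
        if system_says_match and not actually_in_database:
            outcome = "false positive"
        elif not system_says_match and actually_in_database:
            outcome = "false negative"
        elif system_says_match:
            outcome = "true positive"
        else:
            outcome = "true negative"
        print(f"score={score:.2f}  match reported={system_says_match}  outcome={outcome}")

Note the trade-off the sketch implies: lowering the threshold tends to reduce false negatives but produce more false positives, and raising it does the reverse, which is part of why the guidelines and oversight terms above come up in debates about police use of FRT.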

Hook

Briefly review the format of the SPAR debate and get students set up with all the requisite materials: their prepared opening statements, the counter-argument builder handout, and the reflection handout. Remind students that you will be keeping very close track of time and that even if they are mid-sentence when time is called, they need to stop. 😬 Similarly, if a team has finished their statement while time still remains, they must WAIT before proceeding.

Mini-Lesson

Reiterate the timing expectations from the Hook: speakers must stop when time is called, even mid-sentence 😬, and a team that finishes their statement early must WAIT before proceeding. (Finishing early is much more common during the first few SPAR debates.) Then give students FIVE minutes to practice their opening statements and study their notes.

Activity

SPAR debates! Follow the format presented in Module 1.

Once students have completed their third SPAR debate, congratulate them and share some observations you made while circulating throughout the room.

Wrap Up

Students complete the SPAR reflection. Try to give them at least five minutes, preferably ten.
