Module 3, Lesson 2

Active listening, rich discussions, learning through writing

Essential Question:

What is algorithmic bias?

Anchor Text(s) for this Lesson

  • How I'm Fighting Bias in Algorithms (TED Talk by Dr. Joy Buolamwini)

Supporting Text(s)/Resources for this Lesson

Lesson Overview

In this lesson, students build on what they learned in Lesson 1 about training computers to see by applying those concepts to facial recognition technology (FRT). Students watch a TED Talk by Dr. Joy Buolamwini, who describes the glitch she discovered in facial recognition software: it could not detect her face. Students connect their experience training Teachable Machine to the relationship between the training sets used to teach machines and the reliability of the algorithms that result from that training. Students finish the lesson by drafting key points for a brochure intended to educate their community about the technical facts that must be considered when determining where and how to use FRT, or whether it should be used at all.

Nota Bene

For students in need of enrichment, or for longer class periods, consider expanding the activity so that students work on a brochure or mini-book that includes illustrations and at least two distinct claims (one technical and one related to power issues, or one chosen by the student), along with evidence and reasoning that shows the audience why they should take a particular position on whether FRT should be used in the community for surveillance and/or policing. You can also find recommended enrichment reading for students who want to do further research.

Objectives

Students will be able to...

  • Explain in general terms how humans teach computers to ‘see.’

  • Explain how limited data sets impact machine learning.

  • Draft a list of technical facts for an informational brochure.

Suggested Duration

45 minutes (adjust according to your students' needs)

NYS Next Generation ELA Standards

  • W1: Write arguments to support claims that analyze substantive topics or texts, using valid reasoning and relevant and sufficient evidence.

  • RH7: Integrate and evaluate visual and technical information (e.g., in research data, charts, graphs, photographs, videos or maps) with other information in print and digital texts.

  • RH9: Compare and contrast treatments of the same topic in several primary and secondary sources.

  • RST7: Translate information expressed visually or mathematically into words.

NYS Computer Science & Digital Fluency Standards

  • 9-12.IC.5: Describe ways that complex computer systems can be designed for inclusivity and to mitigate unintended consequences.

  • 9-12.DL.2: Communicate and work collaboratively with others using digital tools to support individual learning and contribute to the learning of others.

  • 9-12.IC.3: Debate issues of ethics related to real world computing technologies.

  • 9-12.IC.1: Evaluate the impact of computing technologies on equity, access, and influence in a global society.

Vocabulary

  • algorithm: a set of rules that must be followed when solving a particular problem, such as “find the face(s) in this setting”

  • facial recognition: technology that allows a computer to identify a person by their face

  • facial detection: technology that allows a computer to 'see' whether a face is in its view

  • technical claim: a statement, supported by technical information, that you will argue is true

  • technical: connected with the practical use of machinery, methods, etc. in science and industry

  • detect: to discover or notice something, especially something that is not easy to see, hear, etc.

  • artificial intelligence: the study and development of computer systems that can copy intelligent human behavior

  • machine learning: a type of artificial intelligence in which computers use huge amounts of data to learn how to do tasks rather than being programmed to do them

Hook

Remind students that in the last lesson, they learned a bit about how humans train computers to see. Provide them with the following prompts to discuss with a partner or to respond to independently in their notebooks.

  1. How do you imagine humans can train a computer to detect human faces?

  2. Imagine if your face was not detectable but other people’s faces were. Would this mostly benefit you or cause problems? Explain your reasoning.

Mini-Lesson

Remind students that in the last module, they saw how power issues arise with the use of FRT for surveillance and policing purposes. In the previous lesson, we shifted gears to focus on gathering information, so that we can make technical claims and support those claims with relevant evidence and cogent reasoning.

We also experimented with Teachable Machine to get a sense of how humans train computers to see.

Ask students: What are some of your key takeaways from that lesson?

  • How do you teach a machine to tell the difference between a bee and a three?

  • How does the number of classes and the size of your data set influence the reliability of the algorithm the computer develops after it analyzes the training set (the data you input)? (See the short sketch after these questions for one way to illustrate this.)
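If you want to make the data-size point concrete for yourself before class, the short sketch below is a teacher-facing illustration (not a student activity) that trains the same simple classifier on progressively larger slices of scikit-learn's built-in handwritten-digit images, which stand in here for face photos; the dataset, model choice, and slice sizes are illustrative assumptions, not part of the lesson materials.

```python
# Teacher-facing sketch: how training-set size affects reliability.
# scikit-learn's handwritten-digit images stand in for face photos;
# the model and sample sizes are illustrative choices, not lesson requirements.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

for n in (20, 100, 800):  # tiny, small, and larger training sets
    model = KNeighborsClassifier().fit(X_train[:n], y_train[:n])
    print(f"{n} training examples -> accuracy {model.score(X_test, y_test):.2f}")
```

Accuracy climbs as the training slice grows, which mirrors what students observe in Teachable Machine when they add more examples to each class.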

Ask students: What if the computer detects nothing at all? How might that be related to the training data? Guide students toward the idea that a computer is very unlikely to 'see' something it has not encountered multiple times in the data set provided by the humans training it.
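For teachers who want to see this idea in code, the sketch below is a minimal, assumption-laden stand-in (handwritten digits instead of faces, an arbitrary model choice) showing that a model trained on a limited data set can only ever predict what it has seen, much as facial detection software trained on a narrow range of faces can fail on faces unlike those in its training data.

```python
# Minimal sketch: a model cannot 'see' what it never encountered in training.
# Handwritten digits stand in for faces; the setup is illustrative only.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)

seen = y < 5  # the training data only ever contains the digits 0-4
model = LogisticRegression(max_iter=2000).fit(X[seen], y[seen])

unseen_sevens = X[y == 7]  # images the model has never encountered
print(model.predict(unseen_sevens[:5]))  # every guess comes from 0-4; a 7 is impossible
```

No matter how clear an image of a 7 is, the model's answers are limited to the categories present in its training set.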

The following question is intended to prime students' brains for the upcoming TED Talk by Dr. Buolamwini: How might a limited data set impact algorithms that are trained to support facial detection and facial recognition technologies?

Inform students that when Dr. Joy Buolamwini was an MIT grad student working with facial analysis software, she discovered a serious glitch: the software didn't detect her face.

Tell students: Think about what you learned about machine learning in our last lesson and make a prediction about why the software did not detect Dr. Buolamwini’s face.

Give students the purpose for viewing prior to watching the video. Purpose for viewing: What steps must computer scientists take to ensure that computers learn to detect all faces?

Screen the video How I'm Fighting Bias in Algorithms, then debrief with students.

  • What steps must computer scientists take to ensure that computers learn to detect all faces?

  • What is algorithmic bias?

  • What technical facts presented in this video do you think people should know when determining whether and how FRT should be used?

Activity

Now is your opportunity to consolidate your learning from the last several lessons so that you can use that knowledge to educate your community about FRT.

Draft a technical claim related to facial recognition and support that claim with evidence gleaned from the video you watched today and/or from the videos you watched yesterday, as well as from your experience with Teachable Machine. Use reasoning to explain how and why the claim and evidence are related. Your reasoning should also help your audience understand why this is so important!

Wrap Up

What are some of the technical claims you came up with?

Where did you get stuck?

What questions do you have?
