Digital Response Drill 2.0 – Biases baked into facial recognition software

A nearby school plans to start using facial recognition software on campus. Parents at your site want to know whether you plan to follow suit and why.


Activity

1. Read the Learning 2020 article Is Bias Baked into AI? and the Human Rights Watch (HRW) article Facial Recognition Technology in US Schools Threatens Rights.

2. Discuss the following questions.

What are some examples of AI applications that have been found to have built-in biases?

What does it mean when someone says that biases found in AI are cultural, not personal?

What are some of the concerns related to the use of facial recognition software in schools?

Should your school or district adopt facial recognition software as a safety measure? Why or why not?

3. Create one of the following:

Letter to parents explaining your plans regarding the use of facial recognition software on campus.

Memo to staff explaining your plans regarding the use of facial recognition software on campus.

Press release that includes a statement about your plans regarding the use of facial recognition software on campus.


Additional Resources

How to Keep Human Bias Out of AI, TED Talk 

We should be alarmed by schools’ creepy plan to monitor students, The Guardian 

Tackling bias in Artificial Intelligence (and in humans), McKinsey & Co.

What Do We Do About the Biases in AI?, Harvard Business Review

This is how AI bias really happens–and why it’s so hard to fix, MIT Technology Review 

Discriminating algorithms: 5 times AI showed prejudice, New Scientist 

How bias distorts AI (Artificial Intelligence), Forbes
