Privacy Advocates Demand Ban on Facial Recognition in Schools in Response to Damning Study on the Technology
I have long been concerned with the use of surveillance video cameras in schools for fear that they would ultimately be linked with emerging facial recognition technology to create an even more oppressive environment in our public schools. According to this article by Andrea Germanos from Common Dreams, that day has just arrived. And, as researchers in the Science, Technology, and Public Policy Program (STPP) at the University of Michigan’s Ford School indicate in their recent report, “Cameras in the Classroom,” it is as bad as I feared:
“Schools have also begun to use [FR] to track students and visitors for a range of uses, from automating attendance to school security,” the researchers wrote, though they noted that the technology’s use in schools is “not yet widespread.”
But, the authors added, there’s good reason to stop its spread:
[O]ur analysis reveals that FR will likely have five types of implications: exacerbating racism, normalizing surveillance and eroding privacy, narrowing the definition of the “acceptable” student, commodifying data, and institutionalizing inaccuracy. Because FR is automated, it will extend these effects to more students than any manual system could.
FR “is likely to mimic the impacts of school resource officers (SROs), stop-and-frisk policies, and airport security,” all of which “purport to be objective and neutral systems, but in practice they reflect the structural and systemic biases of the societies around them,” the study says.
And most white Americans support “school resource officers (SROs), stop-and-frisk policies, and airport security” because they know that SROs won’t profile their children, that their children will not be subjected to stop-and-frisk policies, and that their children will never be profiled by the TSA at airport security. This is yet another example of how white privilege is an ocean that white parents and children swim in without realizing they are receiving a benefit. The use of FR will add yet another layer of privilege in the name of safety. As the “Cameras in the Classroom” report notes:
“All of these practices have had racist outcomes due to the users of the systems disproportionately targeting people of color,” the researchers wrote.
We already have more than enough “normalization of surveillance” in the name of safety. Adding FR will only further erode what liberty we have not already sacrificed to algorithms. FR should not be used in schools. Period.