
In Lockport, Fear Wins Out Over Privacy and Facial Recognition is In Place

February 10, 2020

I posted earlier about the Lockport (NY) School District’s decision, made two years ago, to spend millions of dollars implementing facial recognition software designed by Aegis in its schools. That decision drew pushback from several community members, the State Education Department, and the ACLU. I read this weekend in Davey Alba’s SFGate article that, despite these appeals, and after addressing the State’s concerns, the district is moving forward with the plan. The ACLU’s spokesperson did an eloquent job of explaining the negative consequences:

“Subjecting 5-year-olds to this technology will not make anyone safer, and we can’t allow invasive surveillance to become the norm in our public spaces,” said Stefanie Coyle, education counsel for the New York Civil Liberties Union. “Reminding people of their greatest fears is a disappointing tactic, meant to distract from the fact that this product is discriminatory, unethical and not secure.”

And make no mistake: fear WAS the selling point and technology was the clear antidote:

Robert LiPuma, the Lockport City School District’s director of technology, said he believed that if the technology had been in place at Marjory Stoneman Douglas High School in Parkland, Fla., the deadly 2018 attack there may never have happened.

“You had an expelled student that would have been put into the system, because they were not supposed to be on school grounds,” LiPuma said. “They snuck in through an open door. The minute they snuck in, the system would have identified that person.”

…When the system is on, LiPuma said, the software looks at the faces captured by the hundreds of cameras and calculates whether those faces match a “persons of interest” list made by school administrators.

That list includes sex offenders in the area, people prohibited from seeing students by restraining orders, former employees who are barred from visiting the schools and others deemed “credible threats” by law enforcement.

If the software detects a person on the list, the Aegis system sends an alert to one of 14 rotating part- and full-time security personnel hired by Lockport, LiPuma said. The human monitor then looks at a picture of the person in the database to “confirm” or “reject” a match with the person on the camera.

If the operator rejects the match, the alert is dismissed. If the match is confirmed, another alert goes out to a handful of district administrators, who decide what action to take.

The technology will also scan for guns. The chief of the Lockport Police Department, Steven Abbott, said that if a human monitor confirmed a gun that Aegis had detected, an alert would automatically go to both administrators and the Police Department.
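The passage above describes a two-stage, human-in-the-loop escalation: software match, then human confirmation, then alerts to administrators (and, for guns, to police). A minimal sketch of that routing logic, where all names and types are illustrative assumptions on my part and not the vendor's actual API:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Something the system flagged (illustrative model, not Aegis's)."""
    kind: str  # "face" or "gun"

def route_alert(detection: Detection, monitor_confirms: bool) -> list:
    """Escalation path as described in the article:
    software match -> human monitor -> administrators (plus police for guns)."""
    if not monitor_confirms:
        return []  # a rejected match is simply dismissed
    recipients = ["district administrators"]
    if detection.kind == "gun":
        recipients.append("police department")  # confirmed guns also page police
    return recipients

# A confirmed face match reaches only administrators;
# a confirmed gun detection reaches administrators and police.
print(route_alert(Detection("face"), True))
print(route_alert(Detection("gun"), True))
```

Note how much rides on the two human decisions in this chain: the monitor's confirm/reject click and the administrators' choice of action are the only checks between a software match and a police response.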

So now, the citizens of Lockport can presumably rest easy… that is, unless the Aegis software mistakes one of their children for someone on the “persons of interest” list made by the school administrators… or they resemble anyone on a secret list put together by the administrators and police… or they are a person of color.

In Lockport, black students are disproportionately disciplined. In the 2015-16 school year, 25% of suspended students in the district were black even though enrollment was only 12% black, according to data from the federal Department of Education.
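The disproportionality in those federal figures can be put as a simple risk ratio, the share of suspensions relative to the share of enrollment:

```python
share_of_suspensions = 0.25  # 25% of suspended students were black
share_of_enrollment = 0.12   # 12% of enrollment was black

# How overrepresented black students were among suspensions:
risk_ratio = share_of_suspensions / share_of_enrollment
print(f"{risk_ratio:.1f}x overrepresented")
```

That works out to roughly twice their share of enrollment, which matters enormously if suspended students ever land on the watch list, as the district would like.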

LiPuma, the director of technology, said he believed that Lockport’s system was accurate. He also said he, as well as some other school officials, would like to add suspended students to the watch list in the future, despite the State Education Department’s recent directive that Lockport make it clear in its policy that it is “never” to use the system “to create or maintain student data.” Most school shootings in the past decade, LiPuma said, were carried out by students.

“The frustration for me as a technology person is we have the potential” to prevent a school shooting, he said. “If something happens, I’m not going to feel any better about that, but it wasn’t my decision. That’s on State Ed.”

Jason Nance, a law professor at the University of Florida who focuses on education law and policy, warned that listing students as “persons of interest” could have unintended consequences.

“If suspended students are put on the watch list, they are going to be scrutinized more heavily,” he said, which could lead to a higher likelihood that they could enter into the criminal justice system.

Jayde McDonald, a political science major at Buffalo State College, grew up as one of the few black students in Lockport public schools. She said she thought it was too risky for the school to install a facial recognition system that could automatically call the police.

“Since the percentages for the false matches are so high, this can lead to very dangerous and completely avoidable situations,” McDonald said.
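McDonald’s worry about false matches is the classic base-rate problem: when nearly every face scanned belongs to someone who is *not* on the watch list, even a seemingly accurate matcher produces mostly false alarms. A back-of-the-envelope sketch, using purely illustrative numbers that I am assuming for the exercise (neither figure comes from Aegis or the district):

```python
# Illustrative assumptions only -- not vendor-reported figures.
daily_scans = 10_000         # face sightings across hundreds of cameras
true_matches = 1             # actual watch-list persons seen per day
false_positive_rate = 0.001  # 0.1% of innocent faces wrongly flagged
true_positive_rate = 0.9     # 90% of real matches caught

false_alarms = (daily_scans - true_matches) * false_positive_rate
real_alerts = true_matches * true_positive_rate

# Fraction of all alerts that point at the wrong person:
precision = real_alerts / (real_alerts + false_alarms)
print(f"false alarms per day: {false_alarms:.1f}")
print(f"share of alerts that are wrong: {1 - precision:.0%}")
```

Under these assumptions, roughly nine out of ten alerts would be false, and since face-matching error rates are known to be higher for people of color, those false alerts would not be evenly distributed.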

So an unproven technology that has the potential to inaccurately profile potential offenders is being sold to a school district based on the premise, put forth by a district technologist, that the Parkland shooting would have been prevented had this product been in place. What’s wrong with this picture? Why are we allowing fear to dominate the lives of our school children? How can this be reversed?
