Fear for Profit
Facial recognition is big business. Since September the number of police agencies with access to this technology has doubled. Nearly 900 agencies across 44 states now have systems that not only expand police capabilities but also interface with home security systems. One such system, Ring, is promoted as increasing neighborhood safety. Ring spokeswoman Yassi Shahmiri says, “When communities and local police work together, safer neighborhoods can become a reality.” Yet in most cases this new, hyper-invasive technology has never been proven more effective than other, more human ways of creating safety.
One suburban mother observed that the behavior these home surveillance systems encourage can increase tensions in communities rather than decrease them. She commented in a recent article: “We’re not a neighborhood that’s unsafe. We’re also not a neighborhood where people spend a lot of time outside, interacting with each other, so we turn our Rings on and start dissecting all the children. Shouldn’t we be encouraging each other to go outside, say hello and not just get alerts that you’re walking past?”
Spreading facial recognition technologies to combined police and home use is only one new avenue of moneymaking. The newest thrust is to target school districts and exploit the fears communities have about child safety.
While some members of the Detroit City Council are resisting extended public conversation about surveillance technologies, parents in other cities are arguing that school districts are turning “our kids into lab rats in a high tech experiment in privacy invasion.” In early February the small city of Lockport, New York, turned on facial recognition technology to monitor its eight schools. The activation of the new technology caps a two-year fight to block it. One of the most vocal opponents, Stefanie Coyle, deputy director of the Education Policy Center for the New York Civil Liberties Union, said, “Subjecting 5-year-olds to this technology will not make anyone safer, and we can’t allow invasive surveillance to become the norm in our public spaces.” She explained, “Reminding people of their greatest fears is a disappointing tactic, meant to distract from the fact that this product is discriminatory, unethical and not secure.”
Digital student monitoring is growing, and it is contributing to databases that will track, monitor, identify, misidentify, predict, and profile children. As Education Week reported in May, Florida lawmakers are planning to introduce a statewide database “that would combine individuals’ educational, criminal-justice and social-service records with their social media data, then share it all with law enforcement.”
Charlie Warzel recently wrote about this new K-12 surveillance state, explaining that the Lockport School District’s facial recognition technology has “the capacity to go back and create a map of the movements and associations of any student or teacher.” Gunfire-detecting microphones have been installed in New Mexico schools, and there are playgrounds that require iris scans. A recent ProPublica report explored the deployment of unreliable “aggression detector” microphones in places like Queens, New York. This growth is most likely linked to the number of security and surveillance technology vendors courting school district budgets.
These new technologies are being aggressively marketed to school systems with claims that they provide safety. What they actually provide is profit to companies and data to marketers. They also give police dangerous new capacities to misuse so-called “predictive” data.
The public debate that erupted last spring in Detroit around facial recognition technologies has helped educate all of us about the choices before us. The City Council has a responsibility to provide ongoing opportunities for us to discuss, learn, and evaluate the direction in which we are being pushed by corporations that know stoking fear is good for business.