
School Surveillance Will Never Protect Kids From Shootings

If we are to believe the purveyors of school surveillance systems, K-12 schools will soon operate in a manner akin to some agglomeration of Minority Report, Person of Interest, and Robocop. “Military grade” systems would slurp up student data, picking up on the mere hint of harmful ideations, and dispatch officers before the would-be perpetrators could carry out their vile acts. In the unlikely event that someone were able to evade the predictive systems, they would inevitably be stopped by next-generation weapon-detection systems and biometric sensors that interpret the gait or tone of a person, warning authorities of impending danger. The final layer might be the most technologically advanced—some form of drone or maybe even a robot dog, which would be able to disarm, distract, or disable the dangerous individual before any real damage is done. If we invest in these systems, the line of thought goes, our children will finally be safe.

Not only is this not our present, it will never be our future—no matter how expansive and intricate surveillance systems become.

In the past several years, a host of companies have sprouted up, all promising a variety of technological interventions that will curtail or even eliminate the risk of school shootings. The proposed “solutions” range from tools that use machine learning and human monitoring to predict violent behavior, to artificial intelligence paired with cameras that determine the intent of individuals via their body language, to microphones that identify potential for violence based on a tone of voice. Many of them use the specter of dead children to hawk their technology. Surveillance company AnyVision, for instance, uses images of the Parkland and Sandy Hook shootings in presentations pitching its facial- and firearm-recognition technology. Immediately after the Uvalde shooting last month, the company Axon announced plans for a taser-equipped drone as a means of dealing with school shooters. (The company later put the plan on pause, after members of its ethics board resigned.) The list goes on, and each company would have us believe that it alone holds the solution to this problem.

The failure here is not only in the systems themselves (Uvalde, for one, seemed to have at least one of these “security measures” in place), but in the way people conceive of them. Much like policing itself, every failure of a surveillance or security system typically results in people calling for more extensive surveillance. If a danger is not predicted and prevented, companies often cite the need for more data to address the gaps in their systems—and governments and schools often buy into it. In New York, despite the many failures of surveillance mechanisms to prevent (or even capture) the recent subway shooter, the city’s mayor has decided to double down on the need for even more surveillance technology. Meanwhile, the city’s schools are reportedly ignoring the moratorium on facial recognition technology. The New York Times reports that US schools spent $3.1 billion on security products and services in 2021 alone. And Congress’ recent gun legislation includes another $300 million for increasing school security.

But at their root, what many of these predictive systems promise is a measure of certainty in situations about which there can be none. Tech companies consistently pitch the notion of complete data, and therefore perfect systems, as something that is just over the next ridge—an environment where we are so completely surveilled that any and all antisocial behavior can be predicted and thus violence can be prevented. But a comprehensive data set of ongoing human behavior is like the horizon: It can be conceptualized but never actually reached.

Currently, companies engage in a variety of bizarre techniques to train these systems: Some stage mock attacks; others use action movies like John Wick, hardly good indicators of real life. At some point, macabre as it sounds, it’s conceivable that these companies would train their systems on data from real-world shootings. Yet, even if footage from real incidents did become available (and in the large quantities these systems require), the models would still fail to accurately predict the next tragedy based on previous ones. Uvalde was different from Parkland, which was different from Sandy Hook, which was different from Columbine.

Technologies that offer predictions about intent or motivations are making a statistical bet on the probability of a given future based on what will always be incomplete and contextless data, no matter its source. The basic assumption when using a machine-learning model is that there is a pattern to be identified; in this case, that there’s some “normal” behavior that shooters exhibit at the scene of the crime. But finding such a pattern is unlikely. This is especially true given the near-continual shifts in the lexicon and practices of teens. Arguably more than many other segments of the population, young people are shifting the way they speak, dress, write, and present themselves—often explicitly to avoid and evade the watchful eye of adults. Developing a consistently accurate model of that behavior is nearly impossible.


Not only are these technologies incapable of preventing our worst nightmares, their presence is actively moving us toward a dystopian one. If society were to deploy every surveillance and analytical tool available, schools would be hardened to a point where even the most anodyne signs of resistance or nonconformity on the part of young people would be flagged as potentially dangerous—surely an ongoing disaster for the physical, social, and emotional well-being of children, for whom testing boundaries is an essential element of figuring out both themselves and the world they live in. This applies as well to the proposal for more hardware. It’s possible to envision schools as a site where drones and robots are ready to launch into action, such that they come to resemble some combination of a penitentiary and an Amazon warehouse. Worse yet, this hyper-surveilled future is likely to significantly increase the violence visited upon Black students, trans students, and now, given the overturning of Roe, students seeking information on sexual health. All without bringing us any closer to the intended goal of eliminating shootings.

There’s a long-standing maxim among scholars and activists who study the history of technology: Innovations by themselves will never solve social problems. The school shooting epidemic is a confluence of many issues, none of which we as a society will “tech” our way out of. The common refrain is that these attempts are “better than nothing.” Rick Smith, the Axon CEO who briefly proposed the taser drones, told Motherboard that his plan was in fact motivated by the gridlock in Washington, DC.

In one sense, it is true that doing absolutely nothing may be worse than what we have now. But this artificial dichotomy obscures other options that many countries have already adopted, such as making it harder to obtain weapons capable of inflicting incalculable damage in a matter of seconds. “Better than nothing” describes a set of practices that comes at the expense of children. It’s a half measure, because as a society we are unwilling to do what actually works.

Yet the attempts to offer up constant monitoring and pervasive surveillance as solutions are perhaps worse than nothing—they enrich tech companies selling us “solutions” in the same vein as bulletproof backpacks and chalkboards, while also forestalling the possibility of more proven interventions. These measures appear to offer solutions but obscure the truth that we are consistently failing at one of the most basic functions of a society: protecting the youngest and most vulnerable among us.
