Opinion: I study school shootings. Here’s what AI can — and can’t — do to stop them
Editor’s Note: David Riedman is the founder of the K-12 School Shooting Database, an open-source research project that documents shootings at schools back to 1966. He conducts research on gun violence in schools and has authored multiple peer-reviewed articles on homeland security policy, critical infrastructure protection and emergency management. Formerly, he served as a firefighter and emergency medical technician for 18 years in Maryland, where he reached the rank of captain. He briefly worked for an AI weapons detection company, ZeroEyes. The views expressed in this commentary are his own. Read more opinion at CNN.
Since the start of the 2023-24 school year in August, a gun has been fired on a K-12 campus at least 300 times. Over the past decade, the number of school shootings has increased tenfold from 34 in 2013 to 348 in 2023.
This rapidly escalating pattern of gun violence on campus has left parents, teachers and school officials desperate for any solution.
Many schools have been purchasing new artificial intelligence and technology products marketed to districts looking for help detecting a potential gunman on campus. This intense pressure on school officials to do something to protect students has transformed school security from a niche field into a multibillion-dollar industry.
Public schools often lack funds, equipment and personnel, and AI offers incredible potential to detect threats automatically, faster than any human. There is not enough time, money or person-power to watch every security camera and look inside every pocket of each student’s backpack. When people can’t get this job done, AI technology becomes a powerful proposition.
I’ve collected data on more than 2,700 school shootings since 1966 plus security issues such as swatting, online threats, averted plots, near misses, stabbings and students caught with guns.
Based on my research, there’s no simple solution to this array of threats because school security is uniquely complex. Unlike airport terminals and government buildings, schools are large public campuses that are hubs for community activities beyond traditional school hours.
A weeknight at a high school might have varsity basketball, drama club, adult English-language classes and a church group renting the cafeteria — with potential security gaps amid this flurry of activity.
Two common applications of AI right now are computer vision and pattern analysis with large language models. These provide the opportunity to monitor a campus in ways that humans can’t.
AI is being used at schools to interpret the signals from metal detectors, classify objects visible on CCTV, identify the sound of gunshots, monitor doors and gates, search social media for threats, look for red flags in student records and recognize students’ faces to identify intruders.
This AI software functions best when it’s addressing well-understood and clearly defined problems, like identifying a weapon or an intruder. If these systems work correctly, when a security camera captures a stranger holding a gun, facial recognition flags the person as an unauthorized adult and object classification identifies the gun as a weapon. These two autonomous processes then trigger another set of AI systems to lock the doors, call 911 and send text-message alerts.
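As a rough sketch of that chain of logic, the hypothetical Python below shows how two independent classifications might be combined to trigger automated responses. The names, threshold and data structure are invented for illustration and don't reflect any vendor's actual system:

```python
# Minimal sketch of a detection-to-response pipeline.
# All names, thresholds and data structures are hypothetical illustrations,
# not any vendor's actual system.
from dataclasses import dataclass

@dataclass
class FrameAnalysis:
    weapon_score: float    # model's confidence (0-1) that the frame shows a weapon
    face_authorized: bool  # True if facial recognition matched an enrolled person

WEAPON_THRESHOLD = 0.75    # cutoff the vendor chooses between "gun" and "not a gun"

def respond(frame: FrameAnalysis) -> list[str]:
    """Chain two independent classifications into automated responses."""
    if frame.weapon_score >= WEAPON_THRESHOLD and not frame.face_authorized:
        # Both signals agree: an unauthorized person with a probable weapon.
        return ["lock_doors", "call_911", "send_text_alerts"]
    return []

print(respond(FrameAnalysis(weapon_score=0.82, face_authorized=False)))
# -> ['lock_doors', 'call_911', 'send_text_alerts']
```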
What AI can and can’t do
With school security, we want certainty. Is the person on CCTV holding a gun? We expect a “yes” or “no” answer. The problem is that AI models give “maybe” answers, because AI models are based on probability.
When AI classifies an image as a weapon, the algorithm compares each new image to the patterns of weapons in its training data. AI doesn’t know what a gun is, because a computer program doesn’t know what anything is. When an AI model is shown millions of pictures of guns, it will try to find those shapes and patterns in future images. It’s up to the software vendor to decide the probability threshold that separates a gun from not a gun.
This is a messy process. An umbrella could score 90%, while a handgun partially obscured by clothing might score only 60%. Do you want to avoid a false alarm for every umbrella, or get an alert for every handgun?
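To see how thresholding plays out, here is a toy Python example using the illustrative scores above. The objects and numbers are invented for demonstration, not drawn from any real model:

```python
# Toy demonstration of the threshold dilemma. Objects and confidence
# scores are invented for illustration, not output from a real model.
detections = {
    "umbrella": 0.90,                            # false positive with a high score
    "handgun, partly hidden by clothing": 0.60,
    "handgun, clearly visible": 0.95,
}

for threshold in (0.55, 0.80):
    print(f"threshold = {threshold}:")
    for obj, score in detections.items():
        verdict = "ALERT" if score >= threshold else "ignored"
        print(f"  {obj} (score {score:.2f}): {verdict}")

# At 0.55, everything alerts, so the umbrella triggers a lockdown.
# At 0.80, the umbrella still alerts while the hidden handgun slips through:
# no single cutoff avoids false alarms and catches every gun.
```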
At Brazoswood High School in Clute, Texas, AI software interpreted a dark spot on a CCTV image as a gun, sending the school into lockdown and police racing to campus. The dark spot was a shadow on a drainage ditch that happened to line up with a person walking.
Cameras generate poor-quality images in low light, bright light, rain, snow and fog. Should a school be using AI to make life-or-death decisions based on a dark, grainy image that an algorithm can’t accurately process? A large transit system in Pennsylvania canceled its contract with the same vendor used by Brazoswood because it said the software couldn’t reliably spot guns.
Schools need to understand the limits of what an AI system can — and cannot — do.
Whether it’s layered onto cameras or other hardware, AI isn’t magic. Adding AI software to a magnetometer doesn’t change the physics of a gun and a metal water bottle producing the same signal. This is why one AI screening vendor is being investigated by the FTC and SEC for allegedly inaccurate marketing claims made to schools across the country.
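A toy simulation makes the point: if a gun and a water bottle produce readings drawn from the same distribution, no cutoff on that reading can separate them. The signal values below are invented for illustration:

```python
# Toy simulation: when two objects produce sensor readings from the same
# distribution, no threshold on that reading can separate them.
# The signal values are invented for illustration.
import random

random.seed(0)
gun_readings = [random.gauss(5.0, 1.0) for _ in range(1000)]
bottle_readings = [random.gauss(5.0, 1.0) for _ in range(1000)]  # same physics, same signal

def accuracy(threshold: float) -> float:
    """Call anything at or above the threshold a 'gun'; report overall accuracy."""
    hits = sum(r >= threshold for r in gun_readings)      # guns correctly flagged
    hits += sum(r < threshold for r in bottle_readings)   # bottles correctly passed
    return hits / (len(gun_readings) + len(bottle_readings))

for t in (3.0, 5.0, 7.0):
    print(f"threshold {t:.1f}: accuracy {accuracy(t):.2f}")
# Every threshold lands near 0.50, a coin flip, because the signals overlap completely.
```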
A costly endeavor
The biggest expense with school security is the physical equipment (cameras, doors, scanners) and the staff who operate it. AI software on an old security camera generates revenue for the security-solutions company without the vendor or school needing to spend money on equipment. Saving money is great until a shadow causes a police response for what AI thinks is an active shooter.
Instead of schools choosing to test or acquire the best solutions based on merit, vendors lobby to structure local, state and federal government funding to create a shortlist of specific products that schools are compelled to buy. During a period of rapid AI innovations, schools should be able to select the best product available instead of being forced to contract with one company.
Schools are unique environments and need security solutions — both hardware and software — that are designed for schools from the start. This requires companies to analyze and understand the characteristics of gun violence on campus before developing an AI product. For example, a scanner that is created for sports venues that only allows fans to carry in a limited number of items is not going to function well in a school where kids carry backpacks, binders, pens, tablets, cell phones and metal water bottles each day.
For AI technology to be useful and successful at schools, companies need to address campuses’ greatest security challenges. In my studies of thousands of shootings, the most common situation I see is a teenager who habitually carries a gun in their backpack and fires shots during a fight. Manually searching every student and bag is not a viable solution, because students end up spending hours in security lines instead of classrooms. Searching bags is not an easy task, and shootings still happen inside schools equipped with metal detectors.
Neither image classification from CCTV nor retrofitted metal detectors address the systemic problem of teens freely carrying a gun at school each day. Solving this challenge requires better sensors with more advanced AI than any product available today.
Schools can’t be fortresses
Unfortunately, school security is currently drawing from the past instead of imagining a better future. Medieval fortresses were a failed experiment that ended up concentrating risk rather than reducing it. We are fortifying school buildings without realizing why European empires stopped building castles centuries ago.
The next wave of AI security technology has the potential to make schools safer with open campuses that have invisible layers of frictionless security. When something does go wrong, open spaces provide the most opportunities to seek cover. Children should never be trapped inside a classroom again like they were by the gunman who killed 19 children and two teachers in Uvalde, Texas, in 2022.
Schools stand at the threshold between a troubled past and a safer future. AI can either hold us back or help us get there. The choice is ours.