I’m investigating school shootings. Here’s what AI can and can’t do to stop them:

By David Riedman | June 11, 2024

Editor’s Note: David Riedman is the founder of the K-12 School Shooting Database, an open-source research project documenting school shootings since 1966. He conducts research on gun violence in schools and is the author of numerous peer-reviewed articles on homeland security policy, critical infrastructure protection, and emergency management. He previously served as a firefighter and emergency medical technician in Maryland for 18 years, where he attained the rank of captain. The opinions expressed in this commentary are his own. Read more opinion at CNN.

Since the start of the 2023-24 school year in August, there have been at least 300 shootings on K-12 campuses. The number of school shootings has increased tenfold over the past decade, from 34 in 2013 to 348 in 2023.


The skyrocketing trend of gun violence on campus has left parents, teachers, and school officials desperate for any solution.

Many schools are purchasing new artificial intelligence and technology products marketed to districts looking for help identifying a potential gunman on campus. This intense pressure on school officials to do something to protect students has transformed school security from a niche field into a multibillion-dollar industry.

Public schools often lack funding, equipment, and staff, and AI offers incredible potential to automatically detect threats faster than any human could. There is not enough time, money, or manpower to monitor every security camera or look in every pocket of every student’s backpack. When humans cannot do this job, artificial intelligence technology is an appealing option.

In addition to documenting more than 2,700 school shootings since 1966, I collect data on security issues such as assaults, online threats, thwarted plots, near misses, stabbings, and students caught with guns.

According to my research, there is no simple solution to these types of threats because school security is uniquely complex. Unlike airport terminals and government buildings, schools are large public campuses that are centers of community activities beyond traditional school hours.

On a single weekday, a high school might host a basketball team, a drama club, adult English classes, and a church group that rents the cafeteria; amid all this activity, there may be potential security gaps.

Two common applications of AI currently are computer vision and pattern analysis with large language models. These provide the opportunity to view a campus in ways humans cannot.

In this image from surveillance video, law enforcement officers stage in a hallway after the gunman entered Robb Elementary School in Uvalde, Texas, on May 24, 2022. - Texas House Investigative Committee/Reuters

Artificial intelligence is being used in schools to interpret signals from metal detectors, classify objects visible on CCTV, identify gunshots, monitor doors and entrances, look for threats on social media, look for signs of danger in student records, and recognize students’ faces to identify intruders.

This AI software works best when it addresses well-understood and clearly defined problems, such as identifying a weapon or intruder. If these systems work properly, when a security camera sees a stranger holding a gun, AI software flags the face of an unauthorized adult and object classification identifies the gun as a weapon. These two autonomous processes trigger another AI system to lock doors, call 911, and send text message alerts.
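The two-signal logic described above can be sketched in a few lines. This is an illustrative Python sketch, not any vendor’s actual system; the function names, roster lookup, and response actions are all hypothetical stand-ins for what would be face recognition and object classification models in practice.

```python
# Hypothetical sketch: combine two independent AI classifications
# (face authorization + object label) before triggering a response.

def face_is_authorized(face_id, roster):
    # A real system would run face recognition; here we simulate the
    # result with a simple lookup against a set of authorized adults.
    return face_id in roster

def respond(face_id, object_label, roster):
    """Trigger lockdown actions only when BOTH signals indicate a threat:
    an unauthorized person AND an object classified as a gun."""
    if object_label == "gun" and not face_is_authorized(face_id, roster):
        return ["lock_doors", "call_911", "send_text_alerts"]
    return []

roster = {"staff-017", "staff-042"}  # hypothetical authorized-adult IDs
print(respond("unknown-99", "gun", roster))      # full response triggered
print(respond("staff-017", "umbrella", roster))  # no response
```

Requiring both signals to agree reduces false alarms, but as the article goes on to explain, each signal is itself probabilistic, so "agreement" is only as reliable as the underlying classifications.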

What artificial intelligence can and cannot do

We want certainty about school safety. Is the person on CCTV holding a gun? We want a “yes” or “no” answer. The problem is that AI models provide “maybe” answers, because artificial intelligence models are probabilistic.

For the AI to classify images as weapons, an algorithm compares each new image to weapon models in the training data. The AI doesn’t know what a gun is because the computer program doesn’t know what anything is. When an AI model is shown millions of images of weapons, the model will try to find that shape and pattern in future images. It is up to the software vendor to decide the probability threshold between gun and not gun.

This is a complex process. An umbrella may score 90%, while a gun partially concealed by clothing may only score 60%. Do you want to avoid false alarms for every umbrella or get alerts for every gun?
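The threshold tradeoff can be made concrete with a short sketch. The scores and threshold values below are illustrative, not drawn from any real product; they mirror the umbrella/concealed-gun example above.

```python
# Hypothetical confidence scores a vision model might assign to objects
# for the "weapon" class. Values are illustrative only.
detections = [
    {"object": "umbrella",             "weapon_score": 0.90},
    {"object": "partially hidden gun", "weapon_score": 0.60},
    {"object": "metal water bottle",   "weapon_score": 0.35},
]

def alerts(detections, threshold):
    """Return the objects the system would flag as weapons at this threshold."""
    return [d["object"] for d in detections if d["weapon_score"] >= threshold]

# A high threshold misses the concealed gun; a low one flags the umbrella.
print(alerts(detections, threshold=0.80))  # ['umbrella']
print(alerts(detections, threshold=0.50))  # ['umbrella', 'partially hidden gun']
```

Notice that no single threshold separates the umbrella from the concealed gun here: raising it suppresses the false alarm but also the true threat. That is the vendor’s dilemma in miniature.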

AI software interpreted this CCTV footage as showing a gun at Brazoswood High School in Clute, Texas, putting the school on lockdown and sending police rushing toward the campus. The dark spot was a shadow along a drainage ditch next to a person walking.

Cameras produce poor quality images in low light, bright light, rain, snow and fog. Should a school use AI to make life-or-death decisions based on a dark, grainy image that an algorithm can’t process accurately? A major transportation system in Pennsylvania canceled its contract with the vendor Brazoswood used because it said the software could not reliably detect weapons.

Schools need to understand the limits of what an AI system can and cannot do.

AI is not magic when it comes to cameras or hardware. Adding AI software to a magnetometer doesn’t change the physics of a gun and a metal water bottle producing the same signal. That’s why one AI scanning vendor is being investigated by the FTC and SEC over allegations of deceptive marketing to schools across the country.

A costly endeavor

The biggest expense of school security is physical equipment (cameras, doors, scanners) and the personnel who operate them. Adding AI software to an old security camera generates revenue for a security solutions company without the vendor or the school spending money on new equipment. Saving money is great until a shadow provokes an armed police response to what the AI thinks is an active shooter.

Rather than schools choosing to test or acquire the best solutions based on merit, vendors lobby to structure local, state, and federal government funding to create a short list of specific products that schools must purchase. In a time of rapid AI innovation, schools should be able to choose the best product available rather than having to contract with a single company.

Schools are unique environments and need both hardware and software security solutions designed for schools from the start. This requires companies to analyze and understand the characteristics of gun violence on campus before developing an AI product. For example, a scanner designed for stadiums, where fans may carry only a limited number of items, won’t work well in a school where kids carry backpacks, folders, pens, tablets, cell phones, and metal water bottles every day.

For AI technology to be useful and successful in schools, companies need to solve the biggest security problems on campuses. Across the thousands of armed incidents I have examined, the most common situation is a young person who habitually carries a gun in their backpack and opens fire during a fight. Manually searching every student and bag is not a viable solution because students would spend hours in security lines instead of in classrooms. Searching bags is no easy task, and shootings still occur inside schools that have metal detectors.

Neither CCTV image classification nor retrofitted metal detectors address the systemic problem of young people freely carrying guns at school every day. Solving this challenge requires better sensors with more advanced artificial intelligence than any product available today.

Schools cannot be castles

Unfortunately, school safety currently borrows from the past rather than imagining a better future. Medieval castles were a failed experiment that intensified rather than reduced risk. We fortify school buildings without understanding why European empires stopped building castles centuries ago.

The next wave of AI security technology has the potential to make schools safer as open campuses with frictionless layers of security. When something goes wrong, open areas provide the most opportunities to escape and hide. Children should never again be trapped in a classroom with a gunman, as happened when 19 children and two teachers were killed in Uvalde, Texas, in 2022.

Schools stand between a troubled past and a safer future. Artificial intelligence can either help us get there or stand in the way. The choice is ours.

