You knew this would happen. Security corporations are selling the latest thing to schools worried about shootings.

Lockport, New York, bought a facial recognition system programmed to recognize the students and teachers who belong in its schools – and to flag the criminals and sexual predators loaded into its database.

Next school year, Lockport schools will have in place the kind of security software used at airports, casinos and sensitive government installations.

Facial recognition and tracking software will add an unprecedented level of security at the schools. District officials have decided locked entrance doors, bullet-proof glass and sign-in registers at the front desk are not enough.

“We always have to be on our guard. We can’t let our guard down,” Lockport Superintendent Michelle T. Bradley said. “That’s the world that we’re living in. Times have changed. For the Board of Education and the Lockport City School District, this is the No. 1 priority: school security.”

Depew schools want to install the same system, as soon as a state funding request is approved.

“When it comes to safety and security, we want to have the best possible,” Depew Superintendent Jeffrey R. Rabey said. “From what I’ve seen, there’s no other like it.”

Studies have shown the technology doesn’t always work well, but the consultant to the district says a Canadian company has worked out the bugs that plagued earlier facial recognition software.

“Lockport will be the first school district in the world with this technology deployed,” said Tony Olivo, an Orchard Park security consultant who helped develop the system.

The software is used by “Scotland Yard, Interpol, the Paris police and the French Ministry of Defense,” Olivo said. “There are a lot of facial recognition systems out there. There is nothing in the world that can do what this technology does.”

Lockport will spend $1.4 million of the state’s money to install the Aegis system, from SN Technologies of Gananoque, Ont., in all 10 district buildings this summer. It’s part of a $2.75 million security package that includes 300 digital video cameras.

Lockport played a role in the system’s development. Olivo said that in the summer of 2015, the software’s creators used Lockport High School to shoot test videos featuring various types of guns.

Rabey said that because Depew has three buildings on one campus, rather than 10 different locations as Lockport has, Depew would need only 75 cameras, and the cost would be $188,000.

“We believe it’s innovative. We believe it’s an investment. And it’s meant to intercept unwanted people and items,” Bradley said.

But Jim Shultz, a Lockport parent, calls the upgrade a waste of money that won’t prevent a school shooting. At best, he said, the district would gain a few seconds of response time if a crazed killer rushed into a Lockport school with an AR-15.

The new system has no X-ray capability. It can’t detect metal, concealed weapons or explosives.

What it can do is alert officials if someone whose photo has been programmed into the system – a registered sex offender, wanted criminal, non-custodial parent, expelled student or disgruntled former employee – comes into range of one of the 300 high-resolution digital cameras.

“A school is now a target, unfortunately,” said Robert L. LiPuma, Lockport’s technology director. “Based on recommendations, things we saw, drills we did, pilots we did, we assessed all of that and we thought this was the best option, economically and responsibly, for the safety of our community.”

If a known bad guy is spotted, or a gun or other weapon is visible to the system’s cameras, the software could flash an alarm to any district officials connected to it, and also to police.

In the last five years, all of the major school shootings in the U.S. have been carried out by current or recent students of the school in question.

At the Sandy Hook massacre in Newtown, Conn., in 2012, the shooter was a mentally ill 20-year-old who shot his way through a locked entrance door before killing 20 children and six employees. Police arrived five minutes after the killer entered.

Failed recognition

Studies have shown that commercially available facial recognition software simply doesn’t work very well.

Researchers have found that it performs well mainly on white men and is markedly less accurate on people of color, women and children.

In one of the most drastic examples, facial recognition software was tested last June on the crowd at a championship soccer game in Cardiff, Wales. The system triggered 2,470 alerts for matches with a police database – but 92 percent of the “matches” turned out to be false. The police blamed the poor quality of the photos in the database.

If the shooter is a current student – someone whose face is programmed in as belonging – the system will not flag him. It would have been no help at Columbine or in Santa Fe, Texas.

The system will have the capacity to track students’ movements in the building.

A good way to prepare for life in a surveillance state.