In a recent article, the ACLU attacked a New York school’s planned implementation of a $4M face recognition system. Caveat: I am a big fan of the ACLU. I’m glad we have an aggressive organization fighting for liberty, and if they are not making some chunk of the population really mad every day, they are not doing their job. I also don’t have an opinion about the use of face recognition in schools, except to say that any implementation of face recognition needs to be done with full knowledge of its limitations, especially if it involves law enforcement.
In this particular case, I think the ACLU is absolutely on the mark to question this system. The appropriate role of mass surveillance tools has not been determined and it’s an important issue. However, their article makes a few significant errors/exaggerations that weaken their case. There is plenty to worry about without having to make things up.
The original source of this recent controversy was an article published by The Buffalo News describing the project. That article raises a huge number of red flags, too many to go into here. But the basic idea of the system is quite common: in essence, it is a conventional video surveillance system with a face recognition add-on. You upload a watchlist of bad guys (sex offenders, whatever) and it scans all incoming live video footage looking for a match. Alternatively, you can upload a picture and it will scan the past 60 days of footage looking for a match. They mention the capability to monitor all of the places a particular student has gone in the past 60 days, but they describe that as a special case, not normal operation.
These types of systems are being heavily promoted right now, and the ones that I have seen have very poor performance compared to manufacturer claims. IPVM, an independent website that analyzes video surveillance systems, has published many scathing critiques of similar systems. There are three main challenges here:
- the faces are completely unconstrained, “in the wild,” as students go about their day
- the cameras are almost always mounted high (to prevent vandalism), which makes accurate recognition very challenging
- they have to operate in a wide variety of ambient lighting
It is not that these systems can’t recognize people – they absolutely can. It is more that there are so many uncontrolled variables that they will not be able to reliably recognize people. To be clear, I know nothing about this particular system, but when a vendor makes an outrageous claim like “99.97% accuracy” they really should be able to back that up with data.
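To make the point concrete, here is a back-of-the-envelope sketch of why a headline accuracy figure by itself is nearly meaningless: what matters is the false match rate multiplied by the volume of face detections. Both input numbers below are assumptions invented for illustration, not vendor data.

```python
# Back-of-the-envelope: a "99.97% accurate" system, naively read as a
# 0.03% false match rate, still generates false alarms in proportion
# to how many faces it scans. Both inputs are assumed for illustration.

false_match_rate = 0.0003      # assumed: 1 - 0.9997
detections_per_day = 20_000    # assumed: face detections across a school's cameras

expected_false_matches = false_match_rate * detections_per_day
print(f"Expected false matches per day: {expected_false_matches:.0f}")
# With these assumed inputs, that works out to about 6 per day.
```

Even a tiny per-detection error rate, applied to thousands of faces per day, produces a steady stream of alarms – which is exactly why a bare accuracy percentage needs supporting data.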
So, great job ACLU! All other considerations aside, this is a system that needs more scrutiny and rigorous testing. But unfortunately, the ACLU felt the need to throw some other stuff into the mix to make their case stronger, and that is where my objections lie. I’m going to address each of their objections:
Loss of Student Privacy – The proposed system is intended to house a watchlist of adults, not to track students’ movements. The vendor doesn’t claim it can do that either, and I would be extremely skeptical if they did. Tracking hundreds or thousands of unconstrained faces moving through a school is not possible with any current technology I am aware of.
False Matches – The ACLU cites their own experiment matching a criminal database against photos of members of Congress. That experiment was fundamentally flawed because they set the match threshold too low, and Amazon has debunked it quite well with a well-designed experiment of their own. False matches certainly can be a concern, especially with a watchlist system, but in this case they would inconvenience adults, not kids.
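The threshold point can be illustrated with a toy simulation: draw similarity scores for impostor comparisons (images of different people) from an assumed distribution, then count how many clear a given match threshold. The distribution and the thresholds are assumptions for the sketch, not properties of any real system.

```python
import random

random.seed(0)

# Simulated similarity scores for impostor comparisons (different people).
# The normal distribution and its parameters are assumed for illustration.
impostor_scores = [random.gauss(0.30, 0.10) for _ in range(100_000)]

def false_matches(threshold):
    """Count impostor comparisons that clear the match threshold."""
    return sum(s >= threshold for s in impostor_scores)

for t in (0.50, 0.60, 0.70):
    print(f"threshold {t:.2f}: {false_matches(t)} false matches out of 100,000")
```

In this toy model, dropping the threshold from 0.70 to 0.50 multiplies the number of false matches by orders of magnitude, which is why reporting results obtained at a deliberately low threshold inflates the apparent error rate.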
Discrimination – The ACLU (citing their own flawed experiment again) claims that some face recognition systems are more likely to falsely match people of color. In other articles, they have claimed the opposite: that face recognition systems are less likely to match people of color. Both happen, but you can’t selectively report one and not the other. Based upon my experience, a system would be more likely to fail to match a person of color, especially in dim lighting.
Ineffectiveness – The claim is that this type of system is unlikely to prevent the next school shooting. No argument there. Even if it were perfect, it is unlikely to have the right person in the database, and even if it did, it would not provide enough time for authorities to react.
Fewer Graduates – I think the claim here is that normalization of surveillance technology will alienate kids from school and make them less likely to graduate. I do think we have some societal issues with the increased militarization of schools (not relevant to this discussion), but the proposed system is just a bunch of surveillance cameras similar to what already exists in the school and every other public place. The system won’t be tracking student movements. I don’t see how this can affect graduation rates.
Overall, I would be happy to give the ACLU an A for finding this target to attack. It is expensive, very likely to be ineffective at real-world matching, and almost certainly not useful in a school shooting. But I would have to give them a C (grading on a curve) for bringing a number of irrelevant factors into this issue, weakening their argument considerably.