DHS Face Biometric Scans - Are They Legitimate?

NOTE: This is a reprint of an article originally published by Mary Haskett in January 2018 on a different blog. We are reposting it on the Blink Identity blog because these issues are important and we want to keep our writing on them in one place.

Georgetown Law recently released a report titled "Not Ready For Takeoff" (clever name), harshly critiquing DHS's facial recognition program designed to catch people overstaying their US visas. A travel visa is "conditional authorization granted by a country to a foreigner, allowing them to enter, remain within, or to leave that country. Visas typically include limits on the duration of the foreigner's stay, territory within the country they may enter, the dates they may enter, the number of permitted visits or an individual's right to work in the country in question."

So the key point here is that most foreigners have to apply for and receive a visa before they are allowed to enter the United States, and that visa is verified when they enter the country. The visa usually allows them to stay for a specific number of days. However, we currently have no method for checking whether they leave on time, or at all. So DHS is experimenting with different ways of doing that. If you are interested in privacy, the report is definitely worth a read. The report has led to a number of rather harsh critiques, such as this one.

I'm not going to address the legal issues of the program, but the report does make some technical critiques that are worth discussing, especially because they deal with how to measure the accuracy of a biometric system, an area that comes up constantly in our interactions with customers.

We've talked about accuracy in biometrics often, so I am not going into that again here; you can read a quick refresher at Biometrics 101. The report does correctly discuss the trade-off between the two standard accuracy metrics, the false accept rate and the false reject rate. But what DHS actually wants to measure is quite challenging. Essentially, they want to know how often the system will accept an "active" imposter: someone who knows the database is made up mostly of people of a certain gender, ethnicity, and age, and who matches those demographics in order to fool the system.
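To make that trade-off concrete, here is a minimal sketch, not from the original post, of how a single decision threshold trades false accepts against false rejects. The match scores are made-up illustrative numbers, not real biometric data.

```python
# Illustrative sketch of the false accept / false reject trade-off.
# Scores are hypothetical similarity values between 0 and 1.

genuine_scores  = [0.91, 0.84, 0.77, 0.88, 0.69, 0.95, 0.81]  # same-person comparisons
imposter_scores = [0.32, 0.58, 0.41, 0.73, 0.25, 0.49, 0.66]  # different-person comparisons

def error_rates(threshold):
    """Return (false_accept_rate, false_reject_rate) at a given score threshold."""
    false_accepts = sum(s >= threshold for s in imposter_scores)
    false_rejects = sum(s < threshold for s in genuine_scores)
    return false_accepts / len(imposter_scores), false_rejects / len(genuine_scores)

# Raising the threshold lowers false accepts but raises false rejects, and vice versa.
for t in (0.5, 0.7, 0.9):
    far, frr = error_rates(t)
    print(f"threshold={t:.1f}  FAR={far:.2f}  FRR={frr:.2f}")
```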

No one has good measurements of how much more likely a false match becomes when the imposter's demographics are similar to the target's. NIST says it is more likely, but not how much, and it is very expensive to assemble a controlled population to measure this. But based upon my experience, the worst imaginable case would be that an imposter has a 5% chance of getting through the system. That translates to a 95% chance of being sent to secondary screening, where they would almost certainly get caught. If that were a significant problem, DHS would be catching lots of imposters in secondary screening.
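As a back-of-the-envelope illustration, here is a small sketch of what that assumed worst-case rate implies for a hypothetical batch of imposter attempts. The 5% figure is the worst-case assumption above, not a measured value, and the number of attempts is made up.

```python
# Back-of-the-envelope sketch: what a worst-case 5% false accept rate implies.
false_accept_rate = 0.05   # assumed worst-case chance an active imposter is matched
attempts = 1_000           # hypothetical number of imposter attempts

passed_primary    = attempts * false_accept_rate   # slip through face matching
sent_to_secondary = attempts - passed_primary      # flagged for secondary screening

print(f"Of {attempts} imposter attempts, ~{passed_primary:.0f} would pass primary "
      f"and ~{sent_to_secondary:.0f} would be sent to secondary screening.")
```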

In terms of biometric accuracy, I think the report really just misses the point. The system is not perfect - no classification system is. The better question to ask is whether this system is better than what we have now. We have no exit system at all today, so this system has to be better, assuming there is value in measuring exits at all.