
Risk profile – Net.Wars by Wendy Grossman

Originally published on pelicancrossing.net

So here is this week’s killer question: “Are you aware of any large-scale systems employing this protection?”

It’s a killer question because this was the answer: “No.”

Rewind. For as long as I can remember – and I first wrote about biometrics in 1999 – biometrics vendors have claimed that these systems are designed to be privacy-protecting. The reason, as I was told for a Guardian article on fingerprinting in schools in 2006, is that these systems don’t store complete biometric images. Instead, when your biometric is captured – whether that’s a fingerprint to pay for a school lunch or an iris scan for some other purpose – the system samples points in the resulting image and deploys some fancy mathematics to turn them into a “template”, a numerical value that is what the system stores. The key claim: there is no way to reverse-engineer the template to derive the original image, because the template doesn’t contain enough information.
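To make that claim concrete, here is a toy sketch (in Python, with NumPy) of the kind of reduction vendors describe. It is not any real vendor’s algorithm; the sampling grid and quantization are purely illustrative assumptions.

```python
# Toy sketch of the vendors' claim, assuming a grayscale image as a NumPy
# array. Not any real vendor's algorithm; the sampling grid and the
# quantization step are purely illustrative.
import numpy as np

def toy_template(image, n_points=16):
    """Sample a handful of pixels and quantize them into a short template."""
    h, w = image.shape
    ys = np.linspace(0, h - 1, n_points, dtype=int)
    xs = np.linspace(0, w - 1, n_points, dtype=int)
    samples = image[ys, xs].astype(float)       # keep only a few pixels
    return np.round(samples / 32).astype(int)   # coarse quantization

# A 256x256 image holds 65,536 pixel values; the template keeps 16 coarse
# ones, which is why the original image was said to be unrecoverable.
fingerprint = np.random.randint(0, 256, (256, 256))
print(toy_template(fingerprint))
```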

The claim sounds plausible to anyone familiar with one-way cryptographic hashes, or with compressed photographs and music files, where no amount of effort can restore Humpty-Dumpty’s missing data. And yet.

Even at the time, some of the activists I interviewed were dubious about the claim. Even if it was true in 1999, or 2003, or 2006, they argued, it might not be true in the future. Plus, in the meantime these systems were teaching kids that it was OK to use these irreplaceable iris scans, fingerprints, and so on for essentially trivial purposes. What would the consequences be someday in the future when biometrics might become a crucial element of secure identification?

Well, here we are in 2017, and biometrics are more widely used, though not as widely deployed as vendors might have hoped in 1999. (There are good reasons for this, as James L. Wayman explained in a 2003 interview for New Scientist: deploying these systems is much harder than anyone ever thinks. The line that has always stuck in my mind: “No one ever has what you think they’re going to have where you think they’re going to have it.” His example was the early fingerprint system he designed that was flummoxed on the first day by the completely unforeseen circumstance of a guy who had three thumbs.)

So-called “presentation attacks” – for example, using high-resolution photographs to build a dummy finger that spoofs the sensor – have been widely discussed already; for this reason, many deployed systems include a “liveness” test. But it turns out there are other attacks to worry about.

This week, at a European Association for Biometrics symposium on privacy, surveillance, and biometrics, I discovered that Andrew Clymer, who said in 2003 that “Anybody who says it is secure and can’t be compromised is silly”, was precisely right. As Marta Gomez-Barrero explained, in 2013 she published a successful attack on these templates that she called “hill climbing”. Essentially, this is an iterative attack. Say you have a database of stored templates for an identification system; a newly presented image is compared with the database looking for a match. In a hill-climbing attack, you generate synthetic templates, run them through the comparator, and then apply a modification scheme to them until you get a match. The reconstructions Gomez-Barrero showed aren’t always perfect – the human eye may see distortions – but to the biometrics system it’s the same face. You can fix the human problem by adding some noise to the image. The same is true of iris scans (PDF), hand shapes, and so on.
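For the technically curious, here is a minimal sketch of the iterative idea, assuming the attacker can query a comparator that returns a similarity score. The names and parameters are hypothetical, and the published attacks use far smarter modification schemes than random perturbation.

```python
# Minimal sketch of a hill-climbing attack. Assumes the attacker can query
# a comparator that returns a similarity score for a candidate template;
# all names and parameters here are hypothetical.
import random

def hill_climb(compare, length=16, threshold=0.95, iters=100_000):
    """compare(candidate) -> similarity score in [0, 1]."""
    candidate = [random.random() for _ in range(length)]  # synthetic template
    best = compare(candidate)
    for _ in range(iters):
        trial = list(candidate)
        i = random.randrange(length)
        trial[i] += random.gauss(0, 0.05)   # small random modification
        score = compare(trial)
        if score > best:                    # keep only changes that move
            candidate, best = trial, score  # closer to a match
        if best >= threshold:               # comparator accepts: done
            break
    return candidate, best

# Toy stand-in for the comparator; in a real attack this would be the
# biometric system's own matcher running against a stolen template.
_target = [0.5] * 16
_compare = lambda c: 1.0 - sum(abs(a - b) for a, b in zip(c, _target)) / len(c)
print(hill_climb(_compare)[1])  # converges toward a score of 1.0
```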

Granted, someone wishing to conduct this attack has to have access to that database, but given the near-daily headlines about breaches, this is not a comforting thought.

Slightly better is the news that template protection techniques do exist; in fact, they’ve been known for 10 to 15 years and are the subject of ISO standard 24745. Simply encrypting the stored data doesn’t help as much as you might think, because every attempted match requires the template to be decrypted, putting it back in the clear. Unprotected templates have other weaknesses, too. Like reused passwords, the same biometric enrolled in multiple systems is vulnerable to cross-matching, which allows an attacker to link records and extract more information. And if the underlying data is available on the internet – this is especially applicable to face-based systems – an attacker can test candidate images for template matches.
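For illustration, here is a toy sketch of one family of protection techniques – a “cancelable” template built from a secret, per-enrollment random projection, in the spirit of biohashing. It is an assumption-laden illustration, not the ISO/IEC 24745 standard itself.

```python
# Toy sketch of one template-protection idea: a "cancelable" template made
# with a secret, per-enrollment random projection, in the spirit of
# biohashing. An illustrative assumption, not the ISO/IEC 24745 standard.
import numpy as np

def protected_template(features, seed):
    """Project the feature vector through a secret random matrix, then
    binarize. Matching happens in the transformed domain, so the raw
    template is never exposed at match time; after a breach, re-enrolling
    with a new seed revokes the old template, and using different seeds
    per system defeats cross-matching."""
    rng = np.random.default_rng(seed)
    projection = rng.standard_normal((features.size, features.size))
    return (projection @ features > 0).astype(int)

features = np.array([0.2, -0.7, 1.1, 0.4])
old = protected_template(features, seed=42)  # leaked in a breach
new = protected_template(features, seed=7)   # revoked and re-enrolled
print(old, new)                              # same finger, unlinkable codes
```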

It was at this point that someone asked the question we began with: are these protection schemes being used in large-scale systems? And… Gomez-Barrero said: no. Assuming she’s right, this is – again – one of those situations where no matter how carefully we behave we are at the mercy of decisions outside our control that very few of us even know are out there waiting to cause trouble. It is market failure in its purest form, right up there with Equifax, which none of us chooses to use but which still inflicted intimate exposure on hundreds of millions of people; and the port 7547 router bug, which showed you can do everything right in buying network equipment and still get hammered.

It makes you wonder: when will people learn that you can’t avoid problems by denying there’s any risk? Biometric systems are typically intended to handle the data of millions of people in sensitive applications such as financial transactions and smartphone authentication. Wouldn’t you think security would be on the list of necessary features?

Illustrations: A 1930s FBI examiner at work (via FBI); James Wayman; Marta Gomez-Barrero.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard – or follow on Twitter.
