After the 2001 terrorist attacks, interest in face recognition technology predictably surged. DARPA, NIST and other US government research groups stepped up evaluations of the use of face recognition systems at airports and border crossings. Charles Wilson, a scientist at NIST, wrote one of the resulting reports. "It says you need 10 flat fingerprints and a face for registration [creating your database] and two fingerprints and a face for identification," Wilson says. "NIST has no intention of changing that recommendation."
In other words, unless the government plans on getting its most wanted to sit down and offer a full set of prints and a straight-on head shot under photo-friendly lighting, face recognition probably isn't yet ready to help much in the fight against terror.
Still, airports and cities have gone ahead with trials of the technology anyway. Why? Wilson explains, "Science has one degree of optimism, marketing has another."
Marketing forces (not to be confused with market forces) have a persistently problematic relationship with IT. In the absence of a fully developed technology, marketing will fill the void the way air rushes violently into a vacuum. In this particular case, the vacuum was a black hole of dread in our national psyche. As a result, we, the marketers’ audience, probably embraced the marketing message more ardently than usual. Something had to stop terrorism. Face recognition that could pluck terrorists from crowded airport lobbies and concourses was a pleasing vision.
Two months after the attacks, even though the market clearly could not provide a viable face recognition technology, marketing forces were fully able to conjure ideal solutions. Given a wish that cried out to be fulfilled, vendors made overheated declarations about the technology's prowess. Some vendor CEOs shuffled off to Congress to explain (in the words of one vendor's press release) "how facial-recognition technology can be used to thwart the entry into the US of persons who wish to carry out terrorist acts." Fresno airport put a system in, and the city's mayor declared (again, in a vendor's press release), "This is a revolutionary advancement in public safety and it's a system every airport should have. The technology we now have in place will help prevent a repeat of the tragedies of September 11th."
Officials and vendors became a binary star system, held together by the gravitational pull of glowing press. Public officials and agencies could declare they were doing something about terrorism; vendors could proclaim their patriotism by offering free trials of their wares. (Many were likely hoping that regulation would, down the line, drop a massive market right in their laps.)
Of course, eventually, marketing has to be buttressed by something practicable. Face recognition, for all its promise, still doesn't perform well enough in the real world, a place full of nuances, imprecision and people who will raise a stink about invasions of their privacy. Almost as soon as the trials started, things started to look ugly for face recognition.
Six months after Fresno deployed its "revolutionary" system, the airport swapped it out for another because of accuracy problems. Not long after that, Palm Beach airport spiked a trial that produced only a 47 per cent accuracy rate. Then the City of Tampa killed a trial system that had surveilled a downtown entertainment district for two years and produced exactly zero arrests and one rather embarrassing mishap. (On the nightly news, the cameras showed a man eating his lunch. A woman in Oklahoma saw the footage and accused him of being her deadbeat husband. Police tracked down the man, who, it turned out, had never been married and was understandably miffed.)
Logan Airport in Boston has just ended its own trial, which didn't fail exactly but didn't succeed either. Although the trial evaluation notes that the system met the accuracy criteria it had set out to achieve, using the system was an operational nightmare. Camera angles were wrong and some of the cameras were too high. Lighting didn't always cooperate. Plus, "the operators' workload is taxing and strenuous, requiring constant undivided attention and periodic relief, which amounts to a staffing minimum of two persons for one workstation," the report says.
NIST's Wilson says, "[Airports] don't understand what's required to make biometrics work." More importantly, he says, they won't — indeed couldn't, and maybe even shouldn't — invest in what would make face recognition systems work today: controlled lighting, new cameras, photo booths at check-in, fingerprint identification, more staff. "Why are we wasting all this time and money on trials?" asks Wilson.
That's the thing about marketing forces: The physics of the phenomenon are such that it tends to suck energy away from alternative solutions and thoughtful, measured responses in order to sustain itself. Ironically, hype impedes progress. Look no further than the Logan report for evidence of this. "The cost of research and development," it states, "in conjunction with aggressive marketing strategies, has retarded the progress of developing a more mature technology."
And for credible punctuation on that idea, there's Bernard Bailey, the new CEO of Viisage, one of the face recognition vendors involved in the Logan trial. (Bailey was named CEO after the Logan trial ended.) In a newspaper story on the Logan trial, Bailey said, "I don't think that's the best use of our technology. The hype of this technology got way ahead of the capabilities of it."
Charles Wilson at NIST is irritated. "Face recognition is not a panacea. Science generally isn't. But you didn't have to run half-a-dozen airport trials to figure that out. You could have just read our report."
---------- "Alarmed" is a biweekly column about security and privacy. Look for a new installment every other Thursday.