Tuesday 29 May 2012

Protecting civilisation from the fingers of terror

Here's a quotation from an article in New Scientist magazine. You need to know that Visionics is a biometrics company that specialises in face recognition. Now you're an expert:
Airport security isn't the only use for face-recognition software: it has been put through its paces in other settings, too. One example is "face in the crowd" on-street surveillance, made notorious by a trial in the London Borough of Newham. Since 1998, some of the borough's CCTV cameras have been feeding images to a face-recognition system supplied by Visionics, and Newham has been cited by the company as a success and a vision of the future of policing. But in June this year, the police admitted to The Guardian newspaper that the Newham system had never even matched the face of a person on the street to a photo in its database of known offenders, let alone led to an arrest.
Admitted ... the police admitted ...

Clearly, the Newham police, for all sorts of human reasons, somehow entrapped themselves in a deception perpetrated on the public at public expense. Has it happened again?

Last week, Assistant Commissioner Mark Rowley was singing the praises of the mobile fingerprint readers now issued to policemen patrolling in 28 of the UK's 56 police forces. Home Office figures suggest that the flat-print fingerprint technology used in these devices fails about 20% of the time.
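How bad is "fails about 20% of the time"? Here's a back-of-the-envelope sketch — a toy calculation only, which assumes the Home Office figure means each street check of someone who is in the database has a one-in-five chance of going unmatched; the published figures don't define the measure precisely:

```python
# Toy arithmetic only. It assumes the Home Office's "fails about 20% of
# the time" means each street check of someone who IS in the database has
# a 0.2 chance of going unmatched -- that reading is an assumption.

MISS_RATE = 0.20

def expected_misses(checks_of_known_offenders: int) -> float:
    """Expected number of known offenders the reader fails to match."""
    return checks_of_known_offenders * MISS_RATE

for n in (100, 1_000, 10_000):
    print(f"{n:>6,} checks of known offenders -> ~{expected_misses(n):,.0f} missed")
```

On that reading, every fifth known offender checked on the street walks away unidentified — the "straight 20%" referred to below.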

Equally clearly, and to the credit of the Newham police, they finally extricated themselves from this fraud with their admission. Will that happen again?

How long before we read in New Scientist that:
... Assistant Commissioner Mark Rowley admitted to __________ that the MobileID initiative had never even matched the fingerprints of a person on the street to a set of dabs in its database of known offenders, let alone led to an arrest. In fact all it had achieved was to reduce the chances of a felon being taken down to the nick by a straight 20% at a stroke.
For anyone interested in the history of biometrics companies, i.e. how we got into this mess, please note that Visionics merged with a fingerprint-scanning company in June 2002 and became Identix (see below).
Please note also that the New Scientist article quoted above appeared in the 7 September 2002 issue of the magazine, nearly 10 years ago. The article is so full of important observations of mendacity, opportunism and technological incompetence still relevant today that it is further quoted with grateful acknowledgement below:
Face-off
I CAME here looking for an argument but I can't find one. All round this lofty exhibition hall - billed as the world's biggest market for security equipment - the people selling face-recognition systems are being disarmingly, infuriatingly honest ... I thought they'd at least attempt to defend the technology. When they don't, it's me who's caught off guard. Is it true that the systems can't recognise someone wearing sunglasses? Yes, they say. Is it true that if you turn your head and look to one side of the camera, it can't pick you out? Again, yes. What about if you simply don't keep your head still? They nod.

Maybe nine or ten months ago they would have risen to the bait. In those days the face-recognition industry was on a high. In the wake of 11 September, Visionics, a leading manufacturer, issued a fact sheet explaining how its technology could enhance airport security. They called it "Protecting civilization from the faces of terror". The company's share price skyrocketed, as did the stocks of other face-recognition companies, and airports across the globe began installing the software and running trials. As the results start to come in, however, the gloss is wearing off. No matter what you might have heard about face-recognition software, Big Brother it ain't ...

Image Metrics, a British company that develops image-recognition software, ... warned of the danger of exaggerated claims, saying that "an ineffective or poorly applied security technology is as dangerous as a poorly tested or inappropriately prescribed drug" ... to catch 90 per cent of suspects at an airport, face-recognition software would have to raise a huge number of false alarms. One in three people would end up being dragged out of the line - and that's assuming everyone looks straight at the camera and makes no effort to disguise themselves ...

Palm Beach International Airport in Florida released the initial results of a trial using a Visionics face-recognition system. The airport authorities loaded the system with photographs of 250 people, 15 of whom were airport employees. The idea was that the system would recognise these employees every time they passed in front of a camera. But, the airport authorities admitted, the system only recognised the volunteers 47 per cent of the time while raising two or three false alarms per hour ...

To give themselves the best chance of picking up suspects, operators can set the software so that it doesn't have to make an exact match before it raises the alarm. But there's a price to pay: the more potential suspects you pick up, the more false alarms you get. You have to get the balance just right. Visionics - now called Identix after merging with a fingerprint-scanning company in June - is quick to blame its system's lacklustre performance on operators getting these settings wrong ...

Numerous studies have shown that people are surprisingly bad at matching photos to real faces. A 1997 experiment to investigate the value of photo IDs on credit cards concluded that cashiers were unable to tell whether or not photographs matched the faces of the people holding them. The test, published in Applied Cognitive Psychology (vol 11, p 211), found that around 66 per cent of cashiers wrongly rejected a transaction and more than 50 per cent accepted a transaction they should have turned down. The report concluded that people's ability to match faces to photographs was so poor that introducing photo IDs on credit cards could actually increase fraud.

The way people change as they age could also be a problem. A study by the US National Institute of Standards and Technology investigated what happens when a face-recognition system tries to match up two sets of mugshots taken 18 months apart. It failed dismally, with a success rate of only 57 per cent.

There's another fundamental problem with using face-recognition software to spot terrorists: good pictures of suspects are hard to come by ...

Very few security personnel at American airports have CIA clearance, so they aren't allowed to see the images. "Until they've got cleared personnel in each of those airports they can't stop terrorists getting on planes," says Iain Drummond, chief executive of Imagis Technologies, a biometrics company based in Vancouver, Canada ...

Airport security isn't the only use for face-recognition software: it has been put through its paces in other settings, too. One example is "face in the crowd" on-street surveillance, made notorious by a trial in the London Borough of Newham. Since 1998, some of the borough's CCTV cameras have been feeding images to a face-recognition system supplied by Visionics, and Newham has been cited by the company as a success and a vision of the future of policing. But in June this year, the police admitted to The Guardian newspaper that the Newham system had never even matched the face of a person on the street to a photo in its database of known offenders, let alone led to an arrest.
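The arithmetic behind Image Metrics' warning, and behind the threshold-tuning excuse offered by Visionics/Identix, deserves to be spelled out. The sketch below is a toy model, not any vendor's actual algorithm: match scores for impostors and for genuine suspects are drawn from two invented, overlapping distributions, and the alarm threshold is swept downwards to show what buying a 90 per cent hit rate costs in false alarms:

```python
import random

# Toy model of face-recognition threshold tuning. The score distributions
# are invented: impostor scores ~ N(0, 1), genuine-suspect scores ~ N(1.7, 1).
# The parameters were picked so the 90% hit rate lands near Image Metrics'
# "one in three" false-alarm figure; nothing here measures any real system.
random.seed(1)

N = 100_000
impostor_scores = [random.gauss(0.0, 1.0) for _ in range(N)]  # innocent passers-by
genuine_scores = [random.gauss(1.7, 1.0) for _ in range(N)]   # people on the watchlist

def rates(threshold: float) -> tuple[float, float]:
    """Hit rate and false-alarm rate when the system alarms on any score
    at or above `threshold`."""
    hit = sum(s >= threshold for s in genuine_scores) / N
    false_alarm = sum(s >= threshold for s in impostor_scores) / N
    return hit, false_alarm

print(f"{'threshold':>9}  {'hit rate':>8}  {'false alarms':>12}")
for t in (2.0, 1.5, 1.0, 0.43, 0.0):
    hit, fa = rates(t)
    print(f"{t:9.2f}  {hit:8.1%}  {fa:12.1%}")
```

The invented parameters make the 90 per cent operating point land near Image Metrics' one-in-three figure, but the lesson is structural, not numerical: lower the threshold to catch more suspects and the false alarms rise in lockstep — precisely the balance the Palm Beach operators were accused of getting wrong.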
There are more of these gems available in the DMossEsq treasure trove of mendacity, Biometrics: guilty until proven innocent.

Look at the Image Metrics quotation above, "an ineffective or poorly applied security technology is as dangerous as a poorly tested or inappropriately prescribed drug". Prescription drugs are subject to extensive testing before the regulators will sanction their release to the public. Without that, we'd all be dead. The same goes for aircraft design. Without the Civil Aviation Authority, a lot more of us would be dead.

There is none of that open, public, peer-reviewed testing regime when it comes to the government wasting our money on biometrics. Try to find out what justification there is for Whitehall's decision to invest in biometrics and you get a two-year court case and no information.

There is no good reason for this peculiar asymmetry.

How do we avoid the recurrence of Newham-style embarrassments?

It's about time the Office for National Statistics was involved in Whitehall technology decisions. Initiatives that depend on reliable technology should not be allowed to incur substantial public expenditure unless and until the ONS has agreed and published official statistics supporting the business case.
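What would such official statistics look like? At the very least, published trial results with sample sizes and error bars, not a bare percentage in a press release. Here is a minimal sketch of the sort of calculation involved — the standard Wilson score interval, with trial numbers invented for illustration:

```python
from math import sqrt

def wilson_interval(failures: int, trials: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for an observed failure rate."""
    p = failures / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return centre - half, centre + half

# Invented trial figures for illustration: 200 failed reads in 1,000 checks.
lo, hi = wilson_interval(200, 1_000)
print(f"Observed failure rate 20.0%, 95% CI [{lo:.1%}, {hi:.1%}]")
```

Until figures of that kind are agreed and published, a claim like "fails about 20% of the time" — or, for that matter, "works" — is not a statistic, it's an assertion.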
