"EVIDENCE HEARING Current and future uses of biometric data and technologies, Wednesday 26 November 2014, Committee Room 15, Palace of Westminster" – that's what the email said.
What's this all about?
Answer: the House of Commons Science and Technology Committee is looking into biometrics again. Please see Current and future uses of biometric data and technologies, with the following terms of reference:
The Science and Technology Committee is seeking written submissions on the state of development of technologies using biometric data, specifically:
- How might biometric data be applied in the future? Please give examples.
- What are the key challenges facing both Government and industry in developing, implementing and regulating new technologies that rely on biometric data? How might these be addressed?
- How effective is current legislation governing the ownership of biometric data and who can collect, store and use it?
- Should the Government be identifying priorities for research and development in biometric technologies? Why?
The Committee invites written submissions on these issues by midday on Friday 26 September 2014.
What would you say in your submission? Something like this perhaps?
To: Science and Technology Committee
Subject: Current and future uses of biometric data and technologies

1. The questions the Committee asks are premature. Parliament needs to know first whether mass consumer biometrics technology can reliably verify people’s identity. The Committee has already asked that question, back in 2006. It is time the biometrics industry answered it convincingly.

2. Merely to ask the questions it does lays the Committee open to being used as part of the biometrics industry’s marketing campaign. “The United Kingdom House of Commons Science and Technology Committee takes us seriously enough to enquire what legislation is needed to cover the collection of biometric data”, they may say. All steps should be taken by the Committee to ensure that it is not pressed into service as a marketing aid for a flaky technology being promoted by an industry with a history of failure.

3. This submission considers mass consumer biometrics technology only and not the careful and precise biometrics involved in criminal trials. It is directed at the low-grade facial recognition biometrics used in schools, for example, and the informal flat print fingerprinting available on iPhones.

5. Biometrics is one of the technologies that were investigated and, in the course of a 62-page report, the Committee declared itself over 20 times to be “concerned”, it was “surprised” four times, “regretful” three times, “sceptical” twice and, once, the Committee was downright “incredulous” (para.103).

6. Hardly surprising. As the Committee’s report says at para.18: “There are no mutually-accepted standards for testing biometric technology and industry claims about performance vary widely”. The Home Office had elected to rely on biometrics for its identity card scheme while armed with nothing more than salesmen’s representations.

7.
The report called for large-scale independent tests to be conducted to determine the reliability of biometrics and for the results to be published so that public confidence is inspired. That hasn’t happened.

Biometrics at Manchester Airport

8. One thing that has happened is the following little episode.

9. On 16 April 2009 I wrote to Sir David Normington, Permanent Secretary at the Home Office at the time, recommending to him that biometrics statistics should be made official so that they could be used by government departments with the imprimatur of the Office for National Statistics.

10. At Sir David’s behest, a reply dated 26 June 2009 was received from Mr Brodie Clark, Head of the Border Force at the UK Border Agency (UKBA) at the time. Mr Clark said that biometrics were being tested at Manchester Airport and Stansted and added: “The test’s findings demonstrated considerable improvement in this field [facial recognition], and confirmed that the technology could be applied successfully in a one-to-one (verification) mode”.

11. On 8 August 2009, I wrote to Lin Homer, Chief Executive of UKBA at the time, asking her to publish the results of the Manchester Airport biometrics test.

12. She responded in a letter dated 3 February 2010: “UKBA is currently trialling the use of automated gates using facial recognition technology at 10 sites across the UK ... The technology used has proved reliable within the operational environment ... Evaluation of Manchester gave us enough confidence to proceed to expand the trial”.

13. Which means that John Vine CBE QPM’s report, An inspection of border control at Manchester Airport 5-7 May 2010, made surprising reading. Mr Vine was at the time the Independent Chief Inspector of UKBA and he said in his report at para.5.29: “We could find no overall plan to evaluate the success or otherwise of the facial recognition gates at Manchester Airport and would urge the Agency to do so [as] soon as possible”.

14.
The Committee had called for tests to be performed on biometrics and for the results to be published. Clearly that didn’t happen at Manchester Airport.

Biometrics and ID cards – David Moss v Information Commissioner and the Home Office (EA/2011/0081)

15. Did it happen elsewhere?

16. It looks as though it did.

17. On 7 April 2009, the Home Office issued a press release (no longer available on the web), IBM awarded £265 million contract to develop system to manage biometrics for UK government ID cards, passports and visas/residence permits, and on 7 October 2009 the Safran Group issued a related press release about its subsidiary Sagem Sécurité (now known as “Morpho”) being awarded a contract by IBM to provide biometric technology following competitive trials conducted by IBM. Could the public see the report on those trials?

18. That was the question I asked in a Freedom of Information request dated 6 January 2010.

19. Two years later, on 24 April 2012, the First-Tier Tribunal (Information Rights) decided by two-to-one that no, the report could not be published.

20. Among other things, the Tribunal’s Decision noted at paras.32, 43 and 111 that the trial report was so specific that it could not inspire the public’s confidence in general in biometrics and at para.134 it quotes IBM’s evidence to the effect that publication of the trial report could have “no consequential informative value to the public”.

21. Another dead end. The Committee’s recommendation still hasn’t been followed. As far as we know, there is still no reason to believe that biometrics are reliable enough, in a large population like the UK’s, to be used to verify people’s identity.

The reliability of biometrics

22.
The Committee said in its previous report at para.18 that: “The Home Office has stated that it expects the following performance levels to be sufficient for its requirements in the identity cards scheme:

- Face – failure to acquire rate close to zero, a false accept rate of 1%
- Fingerprint – failure to acquire rate of 0.5-1%, false match rate of 1.3e-10, false non-match rate of 0.01
- Iris – failure to acquire rate of 0.5%, false non-match rate of 5%, false match rate of 5e-12”.

The performance actually measured in the trial was nowhere near those levels:

- The false reject rate for facial recognition was something like 50%. We’d all do better to rely on tossing an unbiased coin.
- Far from 1%, the false non-match rate for flat print fingerprinting was more like 20%. If the right to work in the UK depended on flat print fingerprint tests, 20% of people would be wrongly denied that right.
- And the failure to acquire rate for iris was 10%, not 0.5%. That’s for able-bodied participants in the trial. For the disabled, it was 39%. I.e. 39% of disabled people would not exist in a population register that depended on iris biometrics.

24. It is this enormous discrepancy between what is required of mass consumer biometrics and what is available that exercised the Committee all those years ago. That, and the Home Office’s inexplicable insistence on proceeding with ID cards anyway, even knowing that the biometrics they depended on weren’t reliable enough to verify people's identity.

26. The Unique Identification Authority of India (UIDAI) is ploughing on with its Aadhaar system, attempting to identify each one of 1.2 billion Indians uniquely on the basis of their biometrics.

27. That job – identification, as opposed to mere verification – is impossible according to Professor John Daugman, the inventor of biometrics based on irises. You would drown in a sea of false positives if you tried to prove uniqueness on a large national identity register.

28. This matter was investigated by the Center for Global Development in their report on Aadhaar.
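Professor Daugman's sea of false positives is a matter of simple arithmetic: in one-to-many identification, every person on the register must be distinguishable from every other, so the number of opportunities for a false match grows with the square of the population. A minimal sketch of that scaling – the false match rate used here is purely illustrative, not a figure from any trial:

```python
def expected_false_matches(population, false_match_rate):
    """Expected number of spurious matches when de-duplicating a register:
    every pair of distinct people is one opportunity for a false match."""
    pairs = population * (population - 1) // 2
    return pairs * false_match_rate

# Illustrative only: a false match rate of 1 in a million per comparison,
# which would be excellent for a single one-to-one verification.
FMR = 1e-6
for population in (1_000, 1_000_000, 1_200_000_000):
    matches = expected_false_matches(population, FMR)
    print(f"{population:>13,} people -> ~{matches:,.0f} expected false matches")
```

Even at a per-comparison error rate that would be superb for verification, a register of 1.2 billion generates hundreds of billions of spurious matches during de-duplication. That is why proving uniqueness at national scale drowns in false positives.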
Footnote #7 on p.5 of their report says: “UIDAI plans to contain the numbers by eliminating some sources of error unearthed by the initial study, and also by relaxing the FAR [false accept rate/false positives] if needed to further reduce the FRR [false reject rate/false negatives]. Handling false rejections has reportedly been a manageable problem to date”.

29. Relaxing? More like abandoning. The two quantities are inversely related and you have to let false accepts go through the roof in order to keep false rejects down to manageable volumes. What that means in English is that UIDAI can’t deliver the promise in their name of unique identification based on biometrics.

Biometrics is not a science

30. There has been one other relevant report published since the Committee called for evidence on the reliability of biometrics.

32. Two of the authors, Messrs Jim Wayman and Tony Mansfield, have advised the UK government, among others, on biometrics. The third is the head of metrology – measuring things – at NIST itself, Mr Antonio Possolo. And according to them biometrics isn’t a science. It’s out of “statistical control”.

33. So much so that, when NIST have to certify biometric technology before it’s deployed, which they have to do under the provisions of the USA PATRIOT Act, the best they can say is that when they measured the technology, these were the results but the results in the field may be different: “For purpose of NIST PATRIOT Act certification this test certifies the accuracy of the participating systems on the datasets used in the test. This evaluation does not certify that any of the systems tested meet the requirements of any specific government application. This would require that factors not included in this test such as image quality, dataset size, cost, and required response time be included” (p.20). Biometrics is not a science.

34. At the end of 2013, Private Eye magazine named James McCormick Crook of the Year (Eye #1357, p.32).
He is the man who bought novelty golf ball-finders and sold them as explosives detectors to governments whose gullibility or corruption must also be award-winning.

35. What is the difference between Mr McCormick’s business and the global mass consumer biometrics industry?

The Committee’s questions are premature

36. “How might biometric data be applied in the future?”, the Committee now wants to know. People who collect data using today’s mass consumer biometrics technology are doing no more than people who collect stamps. That collection should not be funded by public money and need not be the subject of the Science and Technology Committee’s enquiries.

37. “What are the key challenges facing both Government and industry in developing, implementing and regulating new technologies that rely on biometric data? How might these be addressed?”, the Committee wants to know. Why? Would they bother to hold an enquiry if the question was “What are the key challenges facing both Government and industry in developing, implementing and regulating new technologies that rely on astrological data? How might these be addressed?”
38. “How effective is current legislation governing the ownership of biometric data and who can collect, store and use it?” This question only becomes important if it is first proved that mass consumer biometrics technology is capable of persuading a court that it can be reliably used to verify someone’s identity. We are not yet at that point.
39. “Should the Government be identifying priorities for research and development in biometric technologies?” Yes. In fact the Committee already has. Back in 2006. But the industry and Whitehall have ignored the Committee.
That's the submission DMossEsq would have liked to make, on time, by midday on 26 September 2014. Unfortunately, he didn't.
---------- Updated 25.11.14 #1
By spooky coincidence, the day after the blog post above was published, the following article appeared on DNAIndia.com:
|Move over Astrology, fingerprint analysis is new future|
When Neena Joshi (32), a banker based out of Delhi had a daughter six months back, the first thing she did was to do her "Dermatoglyphics Multiple Intelligences Testing" (DMIT) or fingerprint analysis done as soon as possible. Joshi and her software engineer hubby zeroed in on an expert and spent Rs5,000 to find out the innate qualities of their month-old child through her fingerprints ...
So, what is the dermatoglyphics?
How authentic and accurate the analysis is?
Since the test is very new, there is no data to substantiate the authenticity and accuracy of the test. However, an expert admits, "The accuracy rate of dermatoglyphics analysis can be up to 85%."
Updated 25.11.14 #2

Meanwhile, over on PlanetBiometrics.com we read:
|British lord takes charge of new election tech firm|
Labour peer and veteran British diplomat Lord Mark Malloch-Brown is set to become chairman of a new venture based in London ...
The new firm, called SGO, will become a parent firm for Smartmatic, a Venezuela-based company that had provided e-voting machines for recent biometrics-backed elections in the Philippines and Brazil.
Lord Malloch-Brown, who is listed as chairman of Smartmatic on the UK parliament’s website, told London’s Financial Times that he believed that elections technology can help emerging economies achieve development goals ...
In the coming year, SGO will launch three new companies, each working on spin-off products to tackle issues such as biometric identification, internet voting and climate change, wrote the FT ...
See also: Biometric teething issues in Brazil vote, 6 October 2014.
Updated 28.11.14

Two hours of biometrics evidence given to the Committee, 26 November
During the evidence session above there was some discussion of the privacy implications of biometrics and the security requirements. Open standards can help but they're often ignored by small innovative companies. No-one has a clue what Apple and others are doing with your biometrics. There is no "public visibility", Sir John Adye said at about 10:37, it's a "jungle" (~10:40).
Regulation might help. It would have to be international. There is no international body capable of enforcement. So it might not help. Biometrics, Sir John said (~10:28), could help with anonymisation. Quite how a technology designed to identify you can help to render you anonymous was not made clear.
These matters are only important if the technology works. Does it? The witnesses were careful to emphasise that biometrics is probabilistic. It sort of works. It can't say that you are you or conversely that you're not the person you claim to be. All it can say is that there is a probability that you're you.
What's the highest probability available? No answer. The question wasn't asked.
What probability is acceptable? It depends on the application. For low value transactions or low security access, the probability doesn't need to be high. We knew that.
Biometric authentication isn't sufficient on its own, said many of the witnesses, it needs to be accompanied by PINs and passwords and digital certificates. How much value does biometrics add to those other tokens? No answer. The question wasn't asked. If it's low, will the costs of the biometrics outweigh the benefits?
When you go through security at the airport the photograph of your face will never match the template stored on the chip in your passport perfectly. 100%. It can't. The cameras used will be different, the lighting conditions will be different, your distance from the camera in each case will be different and your face will have changed in the intervening time – up to ten years. So if the matching threshold is set at 100%, no-one will be you. Not even you.
It has to be set lower. But then, not only will you be you, so will a number of other people. Set the matching threshold to 0% and everyone can be you.
Your identity may or may not be verified by any given biometrics system. It all depends on the matching threshold, the setting of which is discretionary. It's up to the operator of the biometrics system.
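The discretionary threshold described above can be sketched in a few lines. The similarity scores below are invented for illustration – real systems derive them from image comparison – but the logic is exactly the trade-off at the e-gate:

```python
# Hypothetical similarity scores (0.0 = no resemblance, 1.0 = perfect match)
# between the template on a passport chip and fresh photographs at the gate.
scores = {
    "you, today, at the gate": 0.83,  # never 1.0: different camera, lighting, older face
    "your sibling": 0.71,
    "a stranger": 0.40,
}

def verified(score, threshold):
    """A biometric system 'verifies' you if your score clears the threshold."""
    return score >= threshold

for threshold in (1.0, 0.8, 0.6, 0.0):
    accepted = [person for person, s in scores.items() if verified(s, threshold)]
    print(f"threshold {threshold:.1f}: accepted as 'you' -> {accepted or 'nobody'}")
# At threshold 1.0 nobody is accepted, not even you; at 0.0 everyone is.
```

Who counts as "you" is decided by wherever the operator sets the threshold, not by anything intrinsic to your face.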
But that's not what we normally mean by "identity". Identity isn't the discretionary gift of a stranger. Not normally. So what are these biometrics systems verifying? No answer. The question wasn't asked.
Watching films like Minority Report
and TV series like CSI
– specifically referred to by Professor Sue Black, which inspired much jollity – gives laymen, including politicians, the impression that biometrics are perfectly reliable. They're not. They may work on screen. They don't work at Heathrow.
If the failure rate is something under 1%, say, then the public might approve of the Committee's efforts – as long as there are plenty of safeguards to ensure that on those occasions when your identity can't be biometrically verified there are other ways to prove your right to work in the UK or your right to non-emergency state healthcare or your children's right to state education. The public might approve.
But if the failure rate is more like 50% (face recognition) or 20% (flat print fingerprinting), where a miss is as good as a mile and the claims to reliability are laughable, then the public won't approve.
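The difference between those failure rates is easier to feel as expected daily headcounts. The passenger volume below is invented for illustration; the failure rates are the ones quoted above:

```python
# Illustrative only: a busy airport processing 100,000 passengers a day.
PASSENGERS_PER_DAY = 100_000

failure_rates = {
    "a rate the public might tolerate": 0.01,  # "something under 1%", as above
    "flat print fingerprinting": 0.20,
    "facial recognition": 0.50,
}

for label, rate in failure_rates.items():
    rejected = int(PASSENGERS_PER_DAY * rate)
    print(f"{label}: ~{rejected:,} genuine passengers wrongly rejected per day")
```

A 1% failure rate means a long but manageable queue at the manual desk. A 50% failure rate means the technology is doing no better than the toss of a coin.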
But more rules? No. Why have an army of regulators – all expensively trained, with administrative headquarters to maintain and international conferences to attend, imposing paperwork on companies who have to recruit consultants to advise them how to survive inspection by the regulators – needed to protect the public against a technology that doesn't work anyway?

Updated 4.12.14
Some people will remember Brodie Clark
, the sometime Head of the UK Border Force. He lost his job for suspending the use of a technology that doesn't work
– biometrics. That kind of nonsense is the result of Whitehall policy being made on the basis of wishful thinking rather than evidence.
Dame Helen Ghosh, Permanent Secretary at the Home Office at the time, told the Home Affairs Select Committee on 22 November 2011 that the Border Force would be reduced by about 900 staff
: "... that is driven as much by technological introductions like e-gates, as well as a risk-based approach. Border Force will be getting smaller ...".
Eight months later on 30 July 2012 the Times
published Border Force in recruitment drive U-turn
: "Hundreds of new immigration officers are to be recruited by the UK Border Force weeks after it disclosed that 450 staff were cut last year ...". Fire them, re-hire them, that kind of nonsense also is the result of Whitehall policy being made on the basis of wishful thinking rather than evidence.
The House of Commons Science and Technology Committee have the opportunity in this investigation into biometrics
to rank evidence above wishful thinking.

Updated 11.12.14

Two hours of biometrics evidence given to the Committee, 10 December 2014

Updated 9.3.15 #1
The Science and Technology Committee has now published its report, Current and future uses of biometric data and technologies
and an excellent report it is, too.
The government depends on biometrics being reliable, most notably at the Home Office for crime detection and border control, for example.

Just how reliable are these biometrics?
The Committee took evidence from Professors Sue Black and Niamh NicDaeid, among others. Both are based at the University of Dundee: Professor Black is the Director of the Centre for Anatomy and Human Identification, while Professor NicDaeid is the Director for Forensic Research within the same Centre.
They know what they're talking about and they submitted evidence
jointly which includes this quotation from Professor Michael J Saks
at Arizona State University:
|From the viewpoint of conventional science, the forensic identification sciences are contenders for being the shoddiest science offered to the courts. After being in business for nearly a century, they still have developed little that would be recognised as a scientific foundation and, consequently, have little basic science to apply to their operational activities.|
Professors Black and NicDaeid do not disagree with Professor Saks. They say:
|The judicial setting increasingly demands a robustness that will satisfy legal admissibility testing and there is strong evidence to support the growing concern that most current biometrics fail to have a sufficiently robust research foundation to reach a meaningful admissibility threshold (para.2).|
|Many current biometric methods receive only minimal scientific grounding and the rigour of testing can often be inadequate, making the degree of reliability and confidence in the biometric open to significant and justified challenge. As a consequence the results of the interpretation of biometric data derived from analysis, as well as the analysis itself, can be called into question by both the user and the assessor thereby generating distrust and suspicion ... We need more science in biometrics (para.3).|
But surely biometric technologies are improving all the time? This is a fast-moving field and the government are having trouble keeping up, aren't they? No:
|There is often a primary wave of interest and application in a novel or emerging biometric e.g. vein pattern analysis, but the underlying solid research is underfunded or non-existent and as a consequence the innovation risks failure in its implementation. This has a direct impact on the robustness of the biometric and confidence in its utilisation and its effectiveness. A new strategic research focus is required to prevent this cycle of failure being replicated (para.9) ...|
|In terms of forensic biometrics, there have been no new significant developments since DNA nearly 30 years ago (para.14).|
"A weak biometric is a dangerous biometric" they say and, without "a new strategic research focus", all we're getting by way of mass consumer biometrics is weak biometrics, dangerous biometrics; there is nothing to "prevent this cycle of failure being replicated".
That's their view.
You may disagree.
But you'd better have a good reason for disagreeing.
Better than having seen films like Minority Report
and believing that what you see is real.
Better than saying that you can't believe Whitehall would back a technology that doesn't work – that's exactly what Professors Black and NicDaeid are saying Whitehall have done.
What about all the other states that rely on biometrics? India
, for example, with their Aadhaar
system. Or New Mexico. Or the Gulf. Or Kenya.
Well what about them?
What do they know and what do you know that Professors Saks, Black and NicDaeid don't know?
Or Professor Louise Amoore of Durham University, reported by the Committee, as saying:
|... those writing “extraction algorithms and matching algorithms … are quite candid that [they are] probabilistic” and see it as the responsibility of “the people who are buying the [biometric] system” to state “what sort of tolerance they have for the false acceptance rate versus the false reject rate”. She stressed that this “doubt in the science [was] present in the room when … writing the code” but that such doubt was “lost by the time [the code was] part of the hardware technology being used (para.47).|
Or Andrew Tyrer, the Interim Head of Enabling Technology at Innovate UK (the Technology Strategy Board) – what do you know that he doesn't?
|Concerns were also raised about the independence of the testing regime. Andrew Tyrer, Innovate UK, told the Committee that a “lot of the rigour that goes into testing” was “quite often fronted by the manufacturers themselves”. He went on to identify a “gap in the market […] around the testing of systems” on the grounds that there was currently “not a lot of independent activity” (para.50).|
None of this can come as a surprise to regular DMossEsq readers who have been entertained for years with the views of Messrs Wayman, Possolo and Mansfield
to the effect that biometrics just isn't a science – it's out of statistical control.

They are not alone and the Committee is to be thanked for adducing more evidence in their cause and in the cause of the UK taxpayer, whose money is being poured down the drain.
There is a lot more in the Committee's report than just this but this is the background against which the rest must be seen, a "cycle of failure being replicated".

Updated 9.3.15 #2
The Science and Technology Committee has now published its report, Current and future uses of biometric data and technologies
and an excellent report it is, too.
"Not so the DMossEsq commentary
on it", you may say, "all you've done is cherry-pick all the critics and ignore all the advocates of biometrics who testify that the technology is reliable".
Prove it. There are 10 current members of the Committee. They took oral evidence from 14 experts and they received 33 written submissions to their enquiry. Find one statement made by the Committee or its witnesses to the effect that you can rely on mass consumer biometrics and that the technology has innumerable successes to its name.
The current state of affairs according to the Committee's report is:
- There is a lack of independent testing of biometrics technology. Whitehall and others tend to have to rely on the manufacturers' own testing.
- The scientists/engineers who develop biometrics products may admit that the technology is only probabilistic ("there is a 70% chance that you are Winona Ryder") but the salesmen promote it as deterministic ("you are Winona Ryder").
- Biometrics is "the shoddiest science offered to the courts" and is locked in a "cycle of failure". Also:
|Dr Guest [University of Kent] suggested that “consumer-level” mobile biometrics—like the iPhone 6—currently had something of a “gimmick value” (para.16).|
At which point the question arises how come Whitehall spends our money on systems whose success depends on the flaky reliability of biometrics?
The Committee's answer is that it's a farce.
They take as their example the Police National Database (PND). When people are held in police custody, their photograph is taken. The police have uploaded several million of these photographs onto the PND. Now read on (para.95):
|The Association of Chief Police Officers (ACPO) described facial recognition as “a less well developed area of biometrics”, though it noted that police have taken photographs of suspects during the custody process “for many years”. ACPO stated that these images had recently “been held digitally” and were “capable of being used within the emerging science of facial recognition”. However, ACPO did not state in its written evidence if this “capability” was operational. Speaking to the Committee, Chief Constable Chris Sims, ACPO, clarified that he was:|
|not aware of forces using facial image software at the moment. There are certainly lots of discussions and there has been some piloting, but from my perspective the technology is not yet at the maturity where it could be deployed, so issues as to how it is used sit as a future debate rather than a current one.|
Face recognition doesn't work, according to ACPO, but nevertheless the police have taken the trouble to upload custody photographs onto the PND even though they're not using the face recognition facilities offered, according to ACPO's representative.
No sooner has the question formed in your head – so why upload the photographs? – than the Biometrics Commissioner disagrees (para.96):
|Mr Alastair MacGregor, Biometrics Commissioner, told us that he was “slightly surprised by some of what [Chief Constable Sims] has said”: it was his “understanding that 12 million-plus custody photographs” had been “uploaded to the PND [Police National Database] and that facial recognition software [was] being applied to them”. When asked to respond to Mr MacGregor’s comments, Chief Constable Sims replied that he too was “surprised” by what he had heard, adding that he “certainly did not think it was an operational reality” before stressing that facial recognition was not his “area of specialty”.|
If this isn't his "area of specialty", what is Chief Constable Sims doing giving evidence to a Science and Technology Committee enquiry into biometrics?
It's a mystery. But no time for that now, there's another mystery. Why is it only the Biometrics Commissioner's understanding
that the police are using face recognition on 12 million+ custody photographs? He's the Biometrics Commissioner. Why doesn't he know?
Here the mystery deepens. At para.102 the Committee tell us that:
|Mr MacGregor was clear that his statutory responsibilities as Biometrics Commissioner related “only to DNA and fingerprints ...”|
We've got a Biometrics Commissioner who's allowed to oversee DNA and fingerprints but not faces. How did that happen?
Deeper still, the mystery goes (para.97):
|The Biometrics Commissioner echoed the [Information Commissioner's Office] point and questioned how “appropriate” it was for the police to put “a searchable database of custody photographs” into “operational use” in the absence of any “proper and effective regulatory regime … beyond that provided for in the Data Protection Act 1998”. He added that the custody photographs loaded on to the PND included “those of hundreds of thousands of individuals who have never been charged with, let alone convicted of, an offence”.|
He's not trying to tell us that this upload of 12 million+ custody photographs is illegal, is he? Yes, he is. The High Court have found against the Metropolitan Police Service (para.98):
|Rather than require “the immediate destruction of the claimants’ photographs”, the Court allowed “the defendant a reasonable further period within which to revise the existing policy” while clarifying that a “reasonable further period” was to be “measured in months, not years”. Over two and half years later, no revised policy has been published.|
How does that happen? How can the police go on ignoring the High Court and doing something illegal for 2½ years?
Isn't there some grand Whitehall strategy behind all this activity?
Of course there is. But there's another mystery:
|Established in 1994, Foresight is the Government Office for Science’s centre for futures analysis. Its role is to help “the UK Government to think systematically about the future” in order to “ensure today’s decisions are robust to future uncertainties”. It is somewhat striking, then, that Foresight's 2013 report Future Identities: changing identities in the UK states on page 6 that biometric identities “were beyond the scope of the project” and therefore provides no evidence or advice to Government on biometrics.|
No use looking to Foresight for any foresight on biometrics, is there anyone else?
There used to be. There was the Biometrics Expert Group but that's been disbanded (para.30). As has the Biometrics Assurance Group (ibid.).
Now there's the Forensics and Biometrics Policy Group (para.32). Or is there?
|It was noticeable that the work of the forensics and biometric policy group was not directly referred to anywhere in the written evidence. We have come across this group before and have previously raised concerns about its lack of transparency and its failure to publish the minutes of its meetings (para.33) ...|
|Without any information about the status of the forensics and biometric policy group—particularly with regard to its independence, or otherwise, from Government—it is not clear whether the Principles [of scientific advice to government] should apply ... when confronted with the status quo, and asked whether it “would help public confidence” if the discussions of the group were “transparent”, the Minister [Lord Bates, Parliamentary Under-Secretary of State for Criminal Information, Home Office] agreed that was “broadly what should happen” though he offered no explanation as to why this had, so far, failed to occur (para.34).|

|Despite a previous assurance from the Government, given over 12 months ago, that the publication of the forensics and biometric policy group’s minutes was on the horizon, this has not occurred (para.35).|
And what's more:
|The need for a Government biometrics strategy?|
37. We have longstanding concerns about the absence of a clear Government strategy and were therefore encouraged by the Government’s reassurance, given in response to our 2013 report, Forensic Science, that it was “drawing up a biometric and forensic strategy to be completed by the end of the year”. It is now early 2015 and no strategy has been published.
There is more. But that's enough for now.
There's no strategy. The Biometrics Commissioner has no responsibility for biometrics based on face recognition. The Home Office have been breaking the law for 2½ years in order to use a technology which they say they don't use and which they say doesn't work. Make what sense of it you can.
The public should extend its thanks to the House of Commons Science and Technology Committee for this lethal insight into the misfeasance of Whitehall.
Updated 12.3.15 #1
Out of statistical control
Feasibility Study on the Use of Biometrics in an Entitlement Scheme
, February 2003, is one of the seminal papers in the modern literature of biometrics. Written by Tony Mansfield and Marek Rejman-Greene, it constitutes a primer in mass consumer biometrics, complete with responsible warnings about the limitations of the technology.
Seven years later, Tony Mansfield published Fundamental issues in biometric performance testing: A modern statistical and philosophical framework for uncertainty assessment
, co-authored with James Wayman and Antonio Possolo. The three of them argue there that biometrics is not a science. It's out of statistical control.
By that they mean among other things that neither laboratory tests nor field trials can tell you what to expect when biometrics systems are in operational use: "We can conclude that the three types of tests [technology, scenario and operational] are measuring incommensurate quantities and therefore [we] should not be at all surprised when the values for the same technologies vary widely and unpredictably over the three types of tests" (p.21).
What are we to make in that case of the evidence given to the Committee by Tony Mansfield's sometime collaborator Marek Rejman-Greene, during which he said that the Immigration and Asylum Biometrics System (IABS) was "tested prior to delivery" (para.53
)? If Messrs Wayman, Possolo and Mansfield are right, that testing will have told the Home Office nothing about how the IABS would perform in operational use.
As it happens, Mr Rejman-Greene told the Committee nothing about how reliable IABS is. For all we know, it is a waste of time and money and performs no useful service.
Updated 12.3.15 #2
"Recent 'breaches of security', including the 'Snowden incident', have made the public increasingly sceptical about who has access to their biometric data and whether it is stored securely" – so say the Committee at para.70
of their excellent report.
JP Morgan Chase are hacked. The US State Department are hacked. US defence contractors are hacked, and so are the security experts advising them.
By this stage, the public may not be "increasingly sceptical" – we may have got past that and now realise that security on the web is a will o' the wisp. In the face of the successes of the black hat hackers, promises of web security are old-fashioned, unrealistic or downright laughable.
While the cost of losses is bearable, banks will continue to offer services over the web. And we will continue to use them. Because by law it is the bank that suffers the fraud. We accountholders get compensated.
It is quite conceivable that the banks would stop offering their services over the web if the cost of fraud exceeded what is bearable.
What is not conceivable is that the public will put up with security breaches of biometrics databases without there being a liability and compensation model in place similar to the banks'. There is no sign in the Committee's report of any such model.
Updated 13.3.15
The Committee's excellent report
identifies three trends in the use of biometrics. The first is "mobile", i.e. the use of biometrics in mobile devices:
|Northrop Grumman predicted that “biometric applications for … mobile devices [would] proliferate” ... significant development ... mobile payments ... innovative contexts ... paradigm shift ... method of assuring identity ... (para.15)|
A number of mobile biometrics are already in operation. Barclays, for example, announced that from 2015 it would be rolling out a biometric reader to access accounts for its corporate clients, instead of using a password or PIN ... Dr Guest suggested that “consumer-level” mobile biometrics—like the iPhone 6—currently had something of a “gimmick value” (para.16).
What examples are there of mobile biometrics applications? An upcoming Barclays Bank one, apparently, and the iPhone 6. That's it.
But it isn't. Why didn't Northrop Grumman mention their work with the old National Policing Improvement Agency on Project Lantern?
The idea was that policemen on the beat could scan a suspect's fingerprints, in the street, using a mobile device, supplied by 3M, and decide on the basis of a match with our national fingerprint database, IDENT1, developed by Northrop Grumman, whether to take the suspect down to the station, please see for example Metropolitan police goes live with mobile fingerprint scanners
, from the 24 May 2012 edition of the Guardian.
Has Project Lantern been a roaring success? In which case why not mention it to the Committee in an enquiry into the current uses of biometrics?
Or has it been a flop? In which case what is there to stop the Barclays Bank and iPhone 6 biometrics applications also being flops?
Updated 14.3.15
The Committee's excellent report
includes a section on testing biometrics, paras.48-56. The success of several government initiatives depends on the reliability of biometric technology. The Committee recommend that the biometrics used should be tested both before and after government systems go live and that the results should be published:
|54. When biometric systems are employed by the state in ways that impact upon citizens’ civil liberties, it is imperative that they are accurate and dependable. Rigorous testing and evaluation must therefore be undertaken prior to, and after, deployment, and details of performance levels published. It is highly regrettable that testing of the ‘facial matching technology’ employed by the police does not appear to have occurred prior to the searchable national database of custody photographs going live. While we recognise that testing biometric systems is both technically challenging and expensive, this does not mean it can be neglected.|
That seems unobjectionable. Wasting public money uploading 12 million+ custody photographs onto the Police National Database on the off-chance that face recognition technology might suddenly start to work after 50 years of failure is indefensibly irresponsible.
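The scale problem can be made concrete with a back-of-the-envelope Python sketch. The gallery size (12 million+) is from the report; the per-comparison false match rate is an assumed figure for illustration only, not a measured one:

```python
# Back-of-the-envelope sketch: searching one probe face against a large
# gallery multiplies even a tiny per-comparison false match rate into
# many false hits. The gallery size is from the report; the false match
# rate is an assumed figure, for illustration only.

gallery_size = 12_000_000      # custody photographs on the Police National Database
false_match_rate = 0.001       # assumed: 0.1% per one-to-one comparison

expected_false_hits = gallery_size * false_match_rate
print(f"Expected false hits per search: {expected_false_hits:,.0f}")
```

Even at an assumed 0.1% per comparison, every search of the database would throw up thousands of false hits, each one a person wrongly flagged.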
Are there any cases of a UK public authority testing the reliability of biometrics before
deployment? Yes, para.53:
|Mr Rejman-Greene [Home Office] highlighted the Immigration and Asylum Biometric System which, he told the Committee, had been “tested prior to delivery”.|
That's what Mr Rejman-Greene says.
There's a lot that he doesn't say.
The success of the Immigration and Asylum Biometric System (IABS) depends on the reliability of face recognition and flat print fingerprinting. IBM are the lead contractor on IABS.
The Home Office provided IBM with millions of fingerprints to test with. This was established at a meeting held at the Home Office on 23 February 2010
. There is no sign that any testing was conducted on face recognition. Flat print fingerprints, yes. Face recognition, no.
The results of IBM's testing for IABS were presented at a conference hosted by the US National Institute of Standards and Technology (NIST) between 1 and 5 March 2010
, see presentation #57. That presentation discussed flat print fingerprint tests only. Not face recognition.
It looks as though the flat print fingerprinting part of IABS was tested prior to deployment. It is not clear that the face recognition part was tested – reminiscent of the testing of face recognition at Manchester Airport
which the Home Office claimed they'd done but which the Independent Chief Inspector of the UK Border Agency could find no signs of.
The public did not attend that NIST conference. Have the results of IBM's IABS tests been published elsewhere?
A Freedom of Information request had already been submitted on 6 January 2010
asking the Home Office to disclose the results. 839 days later on 24 April 2012, at the conclusion of David Moss v. Information Commissioner and the Home Office
, the majority Decision
of the First-tier Tribunal (Information Rights) was received, agreeing with the Home Office that the results should not be published: "... disclosure would not legitimately inform public or scientific debate on the efficiency of biometric recognition systems (para.111)".
Mr Rejman-Greene advised the Home Office on IABS and he attended both the 23 February 2010 meeting at the Home Office and the NIST conference a week later. He told the Committee that IABS was tested before deployment. He didn't tell them that seeing the results wouldn't teach the public anything about "the efficiency of biometric recognition systems".
Paras.50-52 of the Committee's report are taken up with the question of independent testing. It is unsatisfactory that the Home Office tend to rely on biometrics tests conducted by their suppliers.
At para.49 the Committee raise the question of whether testing biometrics actually has any value, i.e. is biometrics a science? This goes back to the paper written by Messrs Wayman, Possolo and Mansfield, three world-class leaders in their field. 3M, Northrop Grumman and the Home Office, represented by Mr Rejman-Greene, offer no response.
Is it possible that Mr Rejman-Greene simply hasn't come across the Wayman, Possolo and Mansfield paper?
Yes, but it's unlikely. That paper was delivered at the NIST conference he attended, please see presentation #59
. He had collaborated with Dr Mansfield on the February 2003
feasibility study for the Home Office. The explosive claims made in the paper must have reverberated around the biometrics world. And yet he expects the Committee to find it confidence-inspiring that IABS was tested before deployment.
At para.48 the Committee say:
|Several witnesses ... commented on the challenges associated with establishing “a comprehensive dataset of subjects with an unbiased population to test against”. Ben Fairhead, 3M, likened the process to that of a “drugs trial”. He explained that “to really prove the accuracy of a new biometric modality” it needed to be tested “with a large number of different people” which, he stated, was “quite expensive”; a point reiterated by Mr Marek Rejman-Greene, Home Office. Erik Bowman, Northrop Grumman, agreed and identified the lack of “availability” of large datasets for testing purposes as a potential barrier to advancing biometrics.|
Erik Bowman's complaint about the lack of large datasets for testing biometrics is undermined when you remember that the police have 12 million+ custody photographs available and that the Home Office gave IBM 10 million fingerprints to test with.
It may be hard work and expensive to assemble a large representative sample of the population to test biometrics. But it is even more expensive to deploy a national mass consumer biometrics system irresponsibly, only to find that it doesn't work.
No-one mentioned that the last time the Home Office bothered to do the job properly was 2004, when face recognition, flat print fingerprinting and iris scanning were tested. The May 2005 report on the field trial involving 10,000 of us demonstrated that none of these technologies work, please see UK Passport Service Biometrics Enrolment Trial.
The results were disregarded because the Home Office wanted to use biometrics anyway, whether or not they work – as early as July 2002
the Home Office had promised biometrics with their plans for introducing government ID cards into the UK and they weren't going to let the facts stand in the way.
Again not mentioned: in their July 2006 report, Identity Card Technologies: Scientific Advice, Risk and Evidence
, to which Mr Rejman-Greene gave evidence, the Committee asked the Home Office to choose biometrics on the basis of proper testing and to publish the results of those tests in order to boost public confidence in the technology.
Nine years later it seems that the cart is still before the horse. And the problem of tests producing the wrong results, which then need to be published, has been solved by not conducting tests.
Updated 16.3.15
25 February 2015 is the publication date of the Committee's excellent biometrics report
and there's Chief Constable Sims talking about face recognition at para.95 saying "the technology is not yet at the maturity where it could be deployed".
In the Committee's earlier report, 20 July 2006, Identity Card Technologies: Scientific Advice, Risk and Evidence
, the Home Office expected face recognition to have a false accept rate of 1%, please see para.18. We've moved a long way in the intervening nine years. A long way backwards – now, the technology is too immature to deploy.
The Home Office expected iris scanning to have a failure to acquire rate of 0.5%. Somewhat optimistic. In the 2004 biometrics enrolment trial
, 10% of the able-bodied participants couldn't be enrolled, their iris scan couldn't be acquired, and that figure rose to 39% for disabled participants. 39% of the disabled population wouldn't exist if that depended on their entry in a register based on iris scanning. Not surprisingly, iris scanning has now been entirely dropped at UK airports.
The Home Office further told the Committee back in 2006 that they expected flat print fingerprint verification to have a false non-match rate of "0.01", i.e. 1%, with the false match rate set down close to zero. They knew perfectly well that that was not achievable. The 2004 trial came up with a figure for false non-matches more like 20%.
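A quick sketch of what the gap between the promised and measured rates means in practice. The two false non-match rates are the ones just quoted; the daily verification volume is a hypothetical figure chosen purely for illustration:

```python
# Sketch of the gap between the promised and trial-measured false
# non-match rates. The two rates are from the document; the daily
# verification volume is an assumed figure for illustration only.

daily_verifications = 100_000    # assumed number of genuine attempts per day

promised_fnmr = 0.01             # Home Office expectation: 1%
trial_fnmr = 0.20                # roughly what the 2004 trial produced

print(f"Genuine users rejected per day at 1%:  {daily_verifications * promised_fnmr:,.0f}")
print(f"Genuine users rejected per day at 20%: {daily_verifications * trial_fnmr:,.0f}")
```

At the trial's rate, roughly one genuine user in five is turned away, twenty times the number the Home Office told Parliament to expect.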
Time was, when the world still believed that mass consumer biometrics is a science, that percentages like this were bandied around all over the world.
No longer. Not a single percentage measuring the reliability of biometrics appears in the 2015 report produced by the Committee.
Such figures have fallen out of favour. Either they suggest that the technology is too unreliable to invest in. Or they suggest that the technology is reliable enough to invest in, thereby becoming warranties, which no-one wants to give because the suppliers don't trust their own technology to work. And if they don't, why should we the taxpayer?
Last May, 2014, the Major Projects Authority published Transparency data: Home Office government major project portfolio data
. Talking about the Immigration and Asylum Biometric System (IABS), they said: "A final review of the programme concluded that IABS has been delivered successfully and is functioning to users’ full satisfaction". No figures. "Successful", "full satisfaction", but no figures.
In what sense is IABS "successful"? How does anyone, including the Major Projects Authority, come to experience "full satisfaction" with IABS? The answers are opaque and can't have anything to do with science.
Post-colonial carpet-baggers are making a fortune selling biometrics systems all over the world, notably to African countries, making promises about electronic voting and the distribution of aid and benefits but with no quantitative warranties.
The same in South America and Asia.
But there is one exception. Aadhaar, the Indian biometrics system, operated by UIDAI
, the Unique Identification Authority of India.
Aadhaar promises the earth. With numbers on. Unaudited numbers, please see India boldly takes biometrics where no country has gone before
: "Unlike other programs, [Aadhaar] reported its performance: Failure-to-enroll rate of 0.14%, False reject rate (FPIR) of 0.057% and false accept rate (FNIR) of 0.035%" (p.1).
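Taken at face value, those quoted rates can be turned into absolute numbers. The three rates below are the ones quoted above; the daily enrolment volume is an assumption for illustration, not a figure from the paper:

```python
# Aadhaar's quoted rates turned into absolute numbers. The three rates
# are the ones quoted in the text; the daily enrolment volume is an
# assumed figure for illustration only.

daily_enrolments = 1_000_000     # assumed enrolments per day

fte = 0.0014                     # failure-to-enrol rate, 0.14%
fpir = 0.00057                   # false positive identification rate, 0.057%

print(f"Failed enrolments per day:       {daily_enrolments * fte:,.0f}")
print(f"False duplicate matches per day: {daily_enrolments * fpir:,.0f}")
```

Even these small-sounding percentages imply hundreds or thousands of failures a day at Aadhaar's enrolment scale, and the figures remain unaudited.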
Aadhaar is multi-modal. It relies on both flat print fingerprints and iris scans: "... iris capture has improved the system 10 to 1000 times [well which?] ... the iris decision alone turned the UID system into a roaring biometrics success and averted a potentially catastrophic failure" (p.3).
And Aadhaar doesn't rely on a single algorithm to do matching. It uses several: "UIDAI’s decision to use three ABIS [Automated Biometric Identification System] initially and allow dynamic replacement of ABIS was received with incredulous stares by the industry" (p.3).
It is because of the absence of irresponsible pseudo-scientific claims that the Committee's report has been received with no incredulous stares at all, and a good thing too:
- Their report reveals the feckless inanity of the police's custody photograph charade.
- It raises doubts about the claimed success of the Home Office's IABS.
- There is a reference to the Government Digital Service's identity assurance programme, "GOV.UK Verify (RIP)" as it's known (paras.25-6), which requires biometrics in order to achieve level of assurance 4 – as GDS show no signs of being able to achieve level of assurance 2, that projected "catastrophic failure"/waste of money needn't detain us for many years yet, if ever.
While the rest of the world wantonly wastes its money on biometrics, the Committee is to be congratulated on doing an excellent job of keeping the UK government's expenditure of your money and mine to a minimum.