Transparency remains something governments around the world struggle to demonstrate when it comes to the biometric systems they deploy on their citizens.
Privacy advocates are calling for more visibility into how governments review, test and use biometric databases and facial verification algorithms, including projects in which agencies collaborate with businesses.
Programs in the UK and Australia are now in question, and a system deployed in Austria began raising transparency concerns in July.
Britain’s National Health Service, one of the few government institutions liked by residents of all political stripes, has not been transparent about a deal with authentication provider iProov to create an administrative app used by 10 million people.
The London-based company provides the technology that lets the app collect and store facial biometrics from video for identity verification, according to The Guardian. People use the app for a number of bureaucratic tasks, including accessing medical records and obtaining a travel certificate showing COVID vaccination.
Although the contract was signed in 2019, according to the publication, the NHS had not acknowledged it on its site as of last week, citing security reasons. Security has also been cited as the reason the public cannot be shown a data protection impact assessment.
For example, officials say it would be unwise to disclose how long the data is stored, according to the article.
Worse, in the eyes of some, a department spokesperson said law enforcement could request – though not demand – the data collected by the system.
In Australia, the government has come under fire for quietly deploying voluntary facial verification systems in the states of New South Wales and Victoria.
According to Reuters, the product is an app through which the government can confirm that someone subject to quarantine orders is, in fact, isolating.
Australian software maker Genvis had previously worked with Western Australia Police to deploy it in that state in November 2020. The expansion to the southeastern states of the country had not been announced.
Western Australian government leaders have reportedly barred use of the app for any purpose unrelated to COVID.
That assurance might be a hard sell for some Australians, given examples elsewhere of mission creep.
European news publisher Euractiv discovered this month that the role of a facial recognition system created by the Austrian federal government to fight crime has been quietly extended.
Established under a 2020 law, the system was sold as a way to find people suspected of serious crimes. Faces recorded by surveillance cameras are compared against a growing database that now holds almost 640,000 entries.
According to Euractiv, the government belatedly acknowledged that the system had since been used to investigate protesters.
Investors, too, called on facial biometrics providers earlier this year to increase their transparency in order to improve public trust.