Biometric technologies: the Defender of Rights opts for alarmism

In a report published on July 19, 2021, the Defender of Rights takes a technophobic view of biometric technologies and recommends numerous safeguards that would all but prohibit their use.

For the Defender of Rights, facial recognition has no place among the tools of the police

Are biometric technologies (facial recognition, fingerprint storage and matching, voice biometrics) a source of progress, or are they bound to cause only harm? For the Defender of Rights, there is no doubt: he sees these technologies solely as a threat. In a report dedicated to the subject, he cites, for example, documented cases in which people belonging to ethnic minorities were unjustly arrested, or even imprisoned, after a false positive from facial recognition software. The Defender of Rights himself acknowledges that “training” these systems, at the design stage, on a small number of Black and Arab faces results in markedly poorer performance on those people. Yet rather than recommending that the software be better trained, he recommends limiting its use.

A little later in its report, the institution says it is concerned about identity checks based on appearance. “Levels of trust in the police do not depend only on the check itself, but also on whether or not it is perceived as racial profiling,” writes the Defender of Rights, who even cites statistics showing that young Black and Arab people are checked more often than white people. But the Defender of Rights does not see facial recognition technologies as a remedy for these human biases: “The use of biometric identification and/or evaluation tools by the police could degrade the relationship between the police and the population if it is not surrounded by sufficient guarantees,” he says.

Automated emotional assessment: the Defender of Rights brandishes the specter of discrimination in hiring

The Defender of Rights is also alarmed, on pages 10 and 13, by the existence of software that allows recruiters to identify, during job interviews, candidates who are “nervous” or otherwise incompatible with the position because of their psychological characteristics, and worries that a low score assigned by such software could reduce a candidate’s chances of being hired. On page 17, however, the Defender of Rights qualifies this and notes that the use of this type of technology would have been illegal in any case, because it contravenes the requirement of relevance. Indeed, Article L1221-8 of the Labor Code specifies that “[t]he methods and techniques for assisting recruitment or evaluating job applicants must be relevant with regard to the purpose pursued.”

More generally, the Defender of Rights recommends the establishment of safeguards: “The deployment of any biometric device cannot be carried out without satisfying strict conditions of necessity and proportionality with regard to the seriousness of the interference caused,” the report reads.