One failing of biometric technology can be attributed to algorithmic bias, which commonly occurs because the underlying technologies are still immature. This bias can produce varying match results across different demographic groups.

Whilst the infancy of biometric technologies is one explanation for high false match rates, the uneven distribution of those errors nonetheless raises ethical concerns around racial unfairness.

The Racial Justice Network (RJN) is an anti-racism charity that seeks to eliminate all forms of racism in the treatment of black people, including racism embedded in technologies that could be interpreted as favouring certain demographics. While biometrics is designed to streamline the user experience, bias in biometric technology has the potential to become a vehicle for discrimination, placing both physical and metaphorical barriers in the way of black people.

The RJN's report on bias in biometric technology (Loyola-Hernández et al., 2022) tackles the common use of facial recognition in routine policing practices, which raises concerns over privacy and the use of invasive technologies. It also feeds into an uncomfortable truth about the relationship between the black community and the police, who have long been accused of treating black people unfairly in relation to crime.

The freedom of the police to use biometric technology at will in investigations exacerbates the “worrying racial disparities we already see in policing”. The report calls for the abolition of unacceptable policing norms such as the disproportionate stopping and searching of black people, who may be forced to provide a photograph that is then used to find matches on the police database.

The Operator Initiated Facial Recognition (OIFR) app should be used only in circumstances where someone cannot confirm their own identity because they are deceased or face mental health, age or medical barriers. Its use is also permitted if the subject refuses to provide their identity or provides false information about it.

In a study assessing racial and gender bias in facial recognition technology (FRT) algorithms, the report cites the National Institute of Standards and Technology, which “analysed 189 facial recognition algorithms submitted by 99 developers” and found that the majority were substantially less likely to correctly identify a black woman than a member of any other demographic.
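To make concrete what such a disparity in match rates means in practice, consider a purely illustrative example (hypothetical rates, not NIST's published figures): if an algorithm falsely matches one in every 1,000 white men it scans but one in every 100 black women, then a black woman passing the same camera is ten times more likely to be wrongly flagged as a match, even though the system may appear highly accurate overall.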

The findings of Big Brother Watch (2022) revealed that between 2016 and 2022 over 3,000 people were falsely identified by police facial recognition, with around 87% of matches proving inaccurate.
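To illustrate how such an inaccuracy figure is conventionally derived (using illustrative round numbers rather than the report's own data): if deployments generated roughly 3,450 match alerts and around 3,000 of them turned out to be misidentifications, the inaccuracy rate would be 3,000 ÷ 3,450 ≈ 87%.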

The Racial Justice Network uncovered that South Wales Police was the only police force in England and Wales to pilot the Operator Initiated Facial Recognition (OIFR) mobile app in policing procedures, and, worryingly, that black people were four times more likely than white people to be stopped and scanned.

Loyola-Hernández, L., Coleman, C., Wangari-Jones, P. and Carey, J. (2022) #HandsOffOurBiodata: Mobilising against police use of biometric fingerprint and facial recognition technology. The Racial Justice Network and Yorkshire Resists, UK.