IDEMIA, a leader in future-proof identity technologies, has ranked highest in a new NIST Face Recognition Vendor Test (FRVT) measuring the level of bias that biometric technology exhibits across different demographic groups.

IDEMIA’s technology produced an even distribution of false biometric matches across people from different demographics, indicating a low level of bias compared with other providers.

Bias in Biometrics 

The test for ‘fairness’ in a technology differs slightly from one for ‘accuracy’: fairness concerns a weakness in biometric technology that detects and matches some facial appearances poorly while performing well on others. Technology from any provider can capture a live facial image and match it against photographic data with high accuracy, yet still exhibit bias, which is known to be caused by a lack of diverse demographic training data, bugs, and inconsistencies in the algorithms.
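
In practice, this kind of fairness is often quantified by comparing false match rates (FMR) between demographic groups at a fixed decision threshold. The sketch below is a minimal illustration of that idea, not IDEMIA's or NIST's actual methodology; the group labels, scores, and threshold are invented for the example.

```python
# Minimal sketch: per-group false match rate (FMR) as a simple fairness check.
# All data here is illustrative, not drawn from any real FRVT evaluation.
from collections import defaultdict

def per_group_fmr(impostor_scores, threshold):
    """impostor_scores: list of (demographic_group, score) pairs from
    impostor comparisons (two different people compared against each other).
    Returns {group: fraction of impostor scores at or above the threshold}."""
    totals = defaultdict(int)
    false_matches = defaultdict(int)
    for group, score in impostor_scores:
        totals[group] += 1
        if score >= threshold:          # an impostor accepted = a false match
            false_matches[group] += 1
    return {g: false_matches[g] / totals[g] for g in totals}

def fairness_ratio(fmrs):
    """Ratio of the highest to the lowest per-group FMR.
    1.0 means false matches are spread evenly across groups."""
    rates = list(fmrs.values())
    lo = min(rates)
    return max(rates) / lo if lo > 0 else float("inf")

if __name__ == "__main__":
    scores = [("A", 0.2), ("A", 0.9), ("A", 0.1), ("A", 0.3),
              ("B", 0.85), ("B", 0.95), ("B", 0.4), ("B", 0.2)]
    fmrs = per_group_fmr(scores, threshold=0.8)
    print(fmrs)                  # {'A': 0.25, 'B': 0.5}
    print(fairness_ratio(fmrs))  # 2.0 -> group B sees twice the false matches
```

A system can score well on overall accuracy (low average FMR) while this ratio stays high, which is exactly the accuracy-versus-fairness distinction described above.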

The technology is not inherently biased, nor does it discriminate based on facial appearance – it is the design of biometric systems and algorithms that specialists are still working to enhance in order to mitigate any “discrimination”.

The collection of training data is often the root cause of bias in these technologies. If the data does not sufficiently represent all demographics, both live detection accuracy and user trust suffer.

“When decisioning is biased, the models are being trained with the wrong data”, says Mitek. Providers like IDEMIA and Mitek have prioritised collecting balanced, representative data to eliminate potential bias.

Mitigating bias 

In conversation with biometricupdate.com, digital identity specialist Stephen Ritter, Chief Technology Officer at Mitek, explained how the industry at large is achieving inclusivity with biometrics, a technology on the verge of being implemented across industry use cases. He argues that biometric technologies verify users reliably most of the time; 44% of consumers say the technology is “very” or “extremely” effective at building their trust in financial services.

He said: “One of the beautiful things about multimodal biometric authentication is that it’s a passive technology; the user is not asked to do anything extra or out of the ordinary” – a quality that itself supports trust.

The variables considered to affect bias in biometric technologies, and thus user trust, are age, gender, and ethnic and racial background.

Alexey said: “In terms of identifying transgender people correctly, out of hundreds of millions of authentications, there has not been a single report of our (Mitek) system discriminating against this group. Our effort to build an unbiased system that works in a multi-dimensional environment has been effective”.

IDEMIA’s score

The Idemia-009 algorithm achieved the fourth-best score in the test across the Mugshot, Border Photos and Kiosk Photos categories. The result reflects IDEMIA’s investment in quality, fair data sets for training and testing its biometric solutions, and the company’s focus on eliminating bias and representing all demographics.

Speaking about IDEMIA’s ranking, CTO Jean-Christophe Fondeur said: “NIST’s FRVT results are further evidence of the highest standard we have set with our suite of facial recognition technologies, positioning fairness as a key criterion, in addition to accuracy”.

“By being more than twice as fair as the top 20 most accurate, we continue to lead the industry in terms of social responsibility. Idemia is paving the way in the ‘battle for fairness’ and I would like to congratulate our teams of experts on their excellent work meeting this priority”.

IDEMIA also ranked highly in the biometric verification accuracy tests, maintaining its position in the top three with 99.88% accuracy against a dataset of 12 million. IDEMIA also secured a top place for single-eye accuracy and a first-place ranking for one-to-one fingerprint recognition.