After three years in development, ISO/IEC 19795-10, the new ISO standard on how to test for technological “bias” in biometric systems, is ready to be put before the committee.
The NIST biometric accuracy test is one example of an evaluation that ranks vendor solutions both by overall accuracy and by their predisposition to recognise individuals of certain demographics more readily than others.
Feedback on the ISO standard is due from national standards bodies by 16 May 2023; multiple international experts have already contributed input.
The announcement was made on LinkedIn by John Howard, Principal Data Scientist at the Maryland Test Facility. Global vendors are working to mitigate bias in their technologies and improve accuracy.
In a presentation on the standard, fairness and demographic differentials at IFPC 2023, Howard discussed fairness models as an active area of research in the broader AI community. He also spoke about the inherent complexities of ensuring demographic fairness in biometrics and of understanding why a level of technical “bias” occurs, which NIST and other standard tests have helped most vendor solutions to successfully mitigate.
Howard touched on the numerous regulations being proposed or implemented across Europe, the U.S., Australia and the UK to govern AI and facial recognition technologies; however, as of 2022, there was no standard for measuring fairness or inherent discrimination within biometric systems.
The standard can be found here: https://www.iso.org/standard/81223.html