The Biometrics Institute has said it welcomed the latest research from the National Institute of Standards and Technology (NIST), which offers the biometric community further insights into bias – otherwise known as demographic differentials.

Ultimately, biometrics is an extremely accurate but probabilistic technology, incredibly useful for searching large datasets quickly in ways that humans could not achieve. It is constantly being improved through testing, including by independent organisations such as NIST. According to NIST, facial recognition technology today is 20 times more accurate than it was just a few years ago.

The issue of bias in biometric and AI systems has been a significant focus of public attention recently. Part 3 of NIST's Face Recognition Vendor Test (FRVT) demonstrates that, for some algorithms, there are situations in which bias can arise.

The Biometrics Institute reinforces to its members the importance of knowing the algorithm they are working with. Biometrics Institute members, acting responsibly and ethically, should work with a good understanding of the strengths and weaknesses of the underlying technology. They should also have procedural safeguards and effective oversight in place to govern its application and to protect human rights and privacy. In addition, they should consider independent testing of their algorithm.

“Biometric technology can be an effective tool to assist in identification and verification across an array of use cases. These range from the convenience of using your face to unlock your phone, to getting through passport control more quickly, to the reassurance that a face can be found in a crowd far faster with the assistance of technology than by relying on a human alone.

“However, when we think of the word bias, we tend to consider it a premeditated, closed-minded and prejudicial human trait. It’s important to remember that technology cannot behave in this way. So-called bias in biometric systems may exist because the data provided to train the system is not sufficiently diverse. That is why, in cases including law enforcement and counter-terrorism, the human in the loop – there to verify the algorithm’s findings – is often a critical aspect of using the technology,” said Isabelle Moeller, Chief Executive of the Biometrics Institute.

The Biometrics Institute recognises that, as with any technology, the convenience of its use comes with challenges. It takes human rights, privacy, spoofing, morphing and bias seriously, and works diligently with its members, expert groups and the wider biometrics community to provide new guidance, and update existing guidance, to mitigate the risks. It also provides regular events and training courses around the world to share and grow knowledge.
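
To make the idea of a demographic differential concrete, the short sketch below shows one simple way such a differential could be measured: compare false match rates across demographic groups at the same decision threshold. This is an illustrative example only; the data, group labels, threshold and function name are invented for the sketch and are not taken from the NIST FRVT report or from any vendor's system.

```python
# Illustrative sketch only: not NIST's methodology and not any vendor's code.
# It shows, on made-up data, how a "demographic differential" can be quantified
# by comparing false match rates (FMR) across groups at a fixed threshold.

from collections import defaultdict

# Hypothetical impostor comparisons: (demographic_group, similarity_score),
# where every pair is known to be two different people.
impostor_scores = [
    ("group_a", 0.41), ("group_a", 0.72), ("group_a", 0.35),
    ("group_b", 0.81), ("group_b", 0.77), ("group_b", 0.30),
]

THRESHOLD = 0.70  # assumed operating threshold for declaring a match


def false_match_rate_by_group(scores, threshold):
    """Fraction of impostor comparisons wrongly accepted as matches, per group."""
    totals = defaultdict(int)
    false_matches = defaultdict(int)
    for group, score in scores:
        totals[group] += 1
        if score >= threshold:
            false_matches[group] += 1
    return {group: false_matches[group] / totals[group] for group in totals}


if __name__ == "__main__":
    for group, fmr in false_match_rate_by_group(impostor_scores, THRESHOLD).items():
        print(f"{group}: FMR = {fmr:.2f}")
    # A persistent gap between groups' FMRs at the same threshold is the kind of
    # demographic differential the NIST FRVT Part 3 report examines.
```

Run on the invented data above, the sketch prints a different false match rate for each hypothetical group; a gap of that kind at a fixed threshold is the sort of differential that more diverse training data and a human in the loop are intended to mitigate.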