Should we be using biometrics for mass surveillance operations when the most likely target is a Beyoncé fan at her concert?
Biometric technologies are even reaching use cases such as Beyoncé concerts and tightened security at the Silverstone F1 race track. When rumours began circulating about deploying facial recognition at the 2024 Paris Summer Olympics, Marie-Laure Denis, president of France's data protection authority (CNIL), strongly rejected any system that would allow the "identification of people on the fly in the public space".
If criminals choose to hide among the crowds, any police or immigration taskforce faces a difficult balance between catching them and respecting people's right to privacy.
South Wales Police, which has acknowledged using biometrics to support better policing practices and catch prolific criminals, focused its surveillance on terrorists and paedophiles potentially hidden among the crowds at a Beyoncé concert where many young children were in attendance. The force maintained that the operation complied with privacy law, with all captured data deleted after 31 days.
Solution providers such as IDEMIA are increasingly meeting standards set by NIST (the National Institute of Standards and Technology) to mitigate bias and satisfy privacy and security requirements.
Some police forces are ramping up rollouts of controversial new software that enables officers to take photos on their mobile phones and search for a database match using facial recognition.
Plans to deploy a controversial app called Operator Initiated Facial Recognition (OIFR) could be realised nationally by next year, according to the National Police Chiefs' Council, and three forces have already begun trialling the new technology.
The trial of iPatrol, the facial recognition app, by South Wales and Gwent Police lasted only three months, and significant criticism is now directed at forces that would tolerate disproportionate bias and privacy failings.