The Federal Trade Commission (FTC) has demonstrated the penalties companies could face if they engage in biometric surveillance practices, deploy substandard technologies, or fail to adopt a comprehensive security program. Rite Aid, a pharmacy chain, was handed a blanket five-year ban on deploying any biometric surveillance systems after serious misconduct involving customers' personal data and privacy. Alongside this action, the FTC issued a proposed order that sets higher expectations for companies considering leveraging the benefits of artificial intelligence (AI) for public-facing use cases.

The company is in the full glare of media attention now that the ban has been imposed; before that, however, the chain operated this technology for eight years across 220 stores in a bid to curb shoplifting. The FTC's complaint also indicated that staff relied on false-positive matches generated by the automated biometric surveillance systems, overriding their own judgement, when deciding whom to report for suspected crimes. As a result, the presence of this technology in stores amplified bias and led to the disproportionate reporting of women and members of certain racial and ethnic groups to the police.

The regulatory environment around biometric technology continues to evolve as privacy rules are refined, even as a strong case is made for using biometrics to support policing. In 2023, legislation was finally introduced to end spending on biometric surveillance and make it illegal for any federal agency to "acquire, possess, access, or use" these systems. Until federal privacy legislation was introduced, regulatory mechanisms were murky to navigate in the U.S.; two proposed bills substantiate this: the Consumer Online Privacy Rights Act and the United States Consumer Data Privacy Act of 2019.

In 2019, the Commission began tackling facial recognition technologies under Section 5 of the FTC Act, while individual states maintained their own regulations in the absence of federal privacy law.

In addition to Rite Aid, other high-profile companies such as Clearview AI have been ordered to pay substantial fines, amounting to $17 million, for serious breaches of data protection laws by harvesting images scraped from social media sites. Facebook and the photo app Everalbum were also the subjects of FTC actions alleging misuse of facial recognition technology, following warnings under Section 5 of the Federal Trade Commission Act.

The ban will lift after five years, but Rite Aid must meet certain conditions, including deleting all collected biometric data (with written confirmation provided to the FTC) and developing an effective monitoring program for any future system. Companies must also be transparent with their customers about data captured through CCTV biometrics or at self-checkouts, and about the circumstances in which a suspected crime may be reported.