This article reprints Monument Advocacy's memo on the House Committee on Homeland Security hearing on DHS use of facial recognition and other biometric technologies (Part II).

On Thursday, February 6, 2020, the House Committee on Homeland Security held a hearing entitled "The Department of Homeland Security's Use of Facial Recognition and Other Biometric Technologies, Part II." Chairman Bennie Thompson (D-MS-2) acknowledged the value of facial recognition technologies as a tool for homeland security but expressed deep concerns about racial, gender, and age biases in facial recognition algorithms. He also raised the issue of data security, noting that a Customs and Border Protection (CBP) contractor experienced a significant data breach in 2019 that led to the theft of U.S. travelers' photographs. Ranking Member Mike Rogers (R-AL-3) stated that CBP has demonstrated the capability of biometrics to improve security, facilitate travel, and better enforce existing immigration laws and regulations. Moreover, he asserted that, based on a December 2019 report published by the National Institute of Standards and Technology (NIST), the facial recognition algorithms adopted by the Department of Homeland Security (DHS) have no statistically detectable race, age, or gender bias. Mr. John Wagner, a senior official with CBP, testified that CBP is using a facial recognition algorithm from one of the highest-performing vendors identified in the NIST report, which assessed demographic-based error rates in facial recognition algorithms. Wagner also stated that CBP's use of facial recognition technologies is a facet of its congressionally mandated biometric entry/exit system, which is modernizing travel and border security.
The second witness, Peter Mina, a Deputy Officer with DHS's Office for Civil Rights and Civil Liberties (CRCL), voiced CRCL's recognition of the potential risks of impermissible bias in facial recognition algorithms and detailed how CRCL is working to minimize such risks. Finally, Dr. Charles Romine walked the committee through the results of the NIST report, explaining how it quantified the effect of age, race, and sex on face recognition performance.

The tone and focus of witness questioning were largely split along partisan lines. Chairman Thompson (D-MS-2) and the Democratic committee members probed CBP's and DHS's processes for safeguarding facial recognition data, ensuring facial recognition screening accuracy, and preventing bias in the screening process. They also highlighted the NIST report's conclusion that certain facial recognition algorithms had higher rates of inaccuracies when testing African and Asian Americans. Ranking Member Mike Rogers (R-AL-3) and the Republican committee members, on the other hand, concentrated primarily on the various national security and travel benefits of facial recognition technology, as well as CBP's plan to adopt an algorithm that showed no detectable demographic bias in the NIST report.

WITNESSES

Mr. John Wagner, Deputy Executive Assistant Commissioner, Office of Field Operations, U.S. Customs and Border Protection, U.S. Department of Homeland Security
Mr. Peter Mina, Deputy Officer for Programs and Compliance, Office for Civil Rights and Civil Liberties, U.S. Department of Homeland Security
Charles Romine, Ph.D., Director of the Information Technology Laboratory, National Institute of Standards and Technology, U.S. Department of Commerce

MEMBERS OPENING STATEMENTS

Chairman Bennie Thompson (D-MS-2)
I want to reiterate that I am not fully opposed to the use of facial recognition technology, as I recognize that it can be a valuable tool for homeland security, but I remain deeply concerned about privacy, data security, and the accuracy of this technology, and I want to ensure those concerns are addressed before DHS deploys it any further. NIST published a report that confirmed age, gender, and racial bias in facial recognition algorithms. For example, certain algorithms misidentified Asian and African American faces 10 to 100 times more often than white faces. These and other findings suggest that certain facial recognition technologies are not ready for widespread, department-wide use. Misidentifying even a small percentage of the public could affect thousands of travelers annually and disproportionately affect certain individuals, including people of color. Data security remains an important concern. Last year, a CBP contractor had a significant data breach that led to the theft of travelers' images. I hope to learn what steps CBP is taking to ensure data protection.

Ranking Member Mike Rogers (R-AL-3)
After the tragic events of September 11th, Congress recognized that biometric systems are essential to our homeland security. Following the recommendation of the 9/11 Commission, Congress charged DHS with the creation of a biometric entry/exit system. CBP has already demonstrated the capability of biometrics to improve security, facilitate travel, and better enforce existing immigration laws and regulations. The government and private sector have made enormous strides in the accuracy, speed, and deployment of biometric systems.
These advances in facial recognition algorithms in particular are transformational. If the majority had taken the time to read the full NIST report before tweeting about it, they would have found the real headline: NIST determined that the facial recognition algorithms adopted by DHS have no statistically detectable race or gender bias. The critical facts are that the facial recognition technology used by DHS is not racially biased, does not harm civil rights, is accurate, and, most importantly, protects the homeland.

WITNESS OPENING STATEMENTS

Mr. John Wagner, CBP
Since CBP is using an algorithm from one of the highest-performing vendors identified in the NIST report, we are confident that our results are corroborated by the findings of this report. Moreover, the report indicates that there is a wide range of performance among the 189 algorithms that NIST reviewed, and the highest-performing algorithms had minimal to undetectable levels of demographic-based error rates. The report also highlights some of the operational variables that impact error rates, such as gallery size, photo age, photo quality, camera quality, lighting, the number of individuals in the gallery, and human behavior factors. That is why CBP has carefully constructed the operational variables in the deployment of the technology to ensure that we can attain the highest match rates, which remain in the 97-98% range. As we build out the congressionally mandated biometric-based entry/exit system, we are creating a system that not only meets the security mandate but does so in a way that is cost-effective, feasible, and facilitative for international travelers. Several existing laws and regulations require travelers to establish their identity and citizenship when entering and departing the U.S.
CBP employs biographic and biometric-based procedures to inspect the travel documents proffered by individuals, to verify the authenticity of the documents, and to determine if they belong to the person presenting them. The use of facial comparison technology simply automates a process that is often done manually today. Our private sector partners, the airlines and airports, must agree to documented, specific CBP business requirements if they are submitting photographs to CBP as part of this process. These requirements include a provision that these photographs must be deleted after they are transmitted to CBP and may not be retained by the private stakeholder.

Mr. Peter Mina, DHS CRCL
CRCL recognizes the potential risks of impermissible bias in facial recognition algorithms, as previously raised by this Committee. CRCL supports rigorous testing and evaluation of algorithms used in facial recognition systems to identify and mitigate impermissible bias. CRCL will continue to support the collaborative relationship between NIST, the DHS Science & Technology Directorate, and other partners. I would like to make three overarching points in my testimony today: 1) the Office for Civil Rights and Civil Liberties (CRCL) has been and continues to be engaged with the DHS operational components to ensure that use of facial recognition technology is consistent with civil rights and civil liberties law and policy; 2) operators, researchers, and civil rights policymakers must work together to prevent algorithms from leading to racial, gender, or other impermissible biases in the use of facial recognition technology; and 3) facial recognition technology can serve as an important tool to increase the efficiency and effectiveness of the Department's public protection mission, as well as the facilitation of lawful travel, but it is vital that these programs utilize this technology in a way that safeguards our constitutional rights and values. To achieve these three points, CRCL: (1) influences DHS
policies and programs throughout their lifecycle; (2) engages with department offices and components in the development of new policies and programs to ensure that the protection of civil rights and civil liberties is fully integrated into their foundations; (3) monitors operational execution and engages with stakeholders in order to provide feedback regarding the impacts and consequences of department programs; and (4) investigates complaints, including allegations of racial bias in the screening process, and makes recommendations to DHS components. As these and future projects develop, CRCL will remain engaged with advocates, technologists, experts, and Congress to ensure that civil rights and civil liberties protections are effective and sufficient.

Dr. Charles Romine, NIST
NIST Interagency Report 8280, released on December 19, 2019, quantifies the effect of age, race, and sex on face recognition performance. It found empirical evidence for the existence of demographic differentials in the face recognition algorithms that NIST evaluated. The report distinguishes between false positive and false negative errors and notes that the impacts of errors are application dependent. NIST conducted tests to quantify demographic differences for 189 face recognition algorithms from 99 developers, using four collections of photographs with 18.27 million images of 8.49 million people. I will first address one-to-one verification applications. There, false positive differentials are much larger than false negative differentials and exist across many, but not all, of the algorithms tested. False positives might present a security concern to the system owner, as they may allow access to impostors. False positives are higher in women than in men and are higher in the elderly and the young compared to middle-aged adults.
Regarding race, we measured higher false positive rates in Asian and African American faces relative to those of Caucasians. There are also higher false positive rates in Native American, American Indian, Alaskan Indian, and Pacific Islander faces. However, a notable exception was for some algorithms developed in Asian countries: there was no such dramatic difference in false positives in one-to-one matching between Asian and Caucasian faces for algorithms developed in Asia. I will now comment on one-to-many search algorithms. Again, the impact of errors is application dependent. False positives in one-to-many search are particularly important because the consequences could include false accusations. For most algorithms, the NIST study measured higher false positive rates in women, African Americans, and particularly African American women. However, the study found that some one-to-many algorithms gave similar false positive rates across these specific demographics, and some of the most accurate algorithms fell into this group. This last point underscores one overall message of the report: different algorithms perform differently. Indeed, all of our FRVT reports note wide variations in recognition accuracy across algorithms, and an important result from the demographics study is that demographic effects are smaller with more accurate algorithms.

QUESTION & ANSWER

Chairman Bennie Thompson (D-MS-2)
Dr. Romine, part of what you said is that how effectively facial recognition technology is deployed depends on the application of the technology. Explain that a little more to the committee.
Dr. Romine: Our approach is to be cognizant of the risk that is associated with facial recognition technology deployment. The studies we do help to inform policymakers, as well as operators of these technologies, about how to quantify these risks for the facial recognition algorithms themselves. The deployed systems have other risks associated with them that we don't test.
That risk, which comes from the error rates associated with the algorithm, is part of a much larger risk management effort that the operators have to undertake.

Mr. Wagner, can you share with the committee the extent to which CBP goes to protect the information collected in this process?
Mr. Wagner: The photographs that are taken by one of our stakeholders' cameras are encrypted; they are then transmitted securely to the CBP cloud infrastructure, where the gallery is positioned. The pictures are templatized, which means they are turned into a type of mathematical structure and cannot be reverse engineered. They are then matched with the templatized photos that we have pre-staged in the gallery.

Regarding the comment that 2 to 3 percent of people are misidentified – what is CBP doing to get that to zero?
Mr. Wagner: That does not mean they are misidentified. It just means they were not matched to a picture in the gallery that we did have of them. That's where we need to look at operational variables, such as the camera quality, the picture quality, the lighting, and the age of the passport photo.

Mr. Mina, have you received any complaints from citizens about this technology?
Mr. Mina: We have received one complaint that references facial recognition technology. However, we have not seen a trend of complaints, which would trigger an investigation. In this matter, we are working on the policy side, advising CBP directly. We also hear from the public at community engagement roundtables around the country. We have heard concerns in those forums, and we're using those concerns to shape our advice.

Dr. Romine, let's be clear, you said that Africans and Asians have been misidentified, correct?
Dr. Romine: In the highest-performing algorithms, we do not see that to a statistical level of significance for one-to-many identification algorithms. For one-to-one identification algorithms, we do see evidence of demographic errors for African and Asian Americans, as well as other minorities.
Ranking Member Mike Rogers (R-AL-3)
There is no racial bias in facial recognition technologies used by DHS. Is that an accurate statement?
Dr. Romine: In the highest-performing algorithms for one-to-many matches, we saw undetectable bias in the demographic differentials that we were measuring.

Did you test the NEC3 algorithm being used by DHS?
Dr. Romine: We tested algorithms from NEC. We have no independent way to verify that that is the specific algorithm being used by CBP. That is something that CBP and NEC would have to attest to.

Mr. Wagner, is CBP using NEC3 algorithms?
Mr. Wagner: We are using an earlier version of NEC right now, and we are testing NEC3, which we plan to upgrade to in March.

Dr. Romine, who can participate in a facial recognition vendor test? Is it accurate to say that some algorithms are far less accurate and sophisticated than others?
Dr. Romine: Yes, anyone around the globe can participate. We have participants from biometrics industries from around the country, some from universities, some from experimental

Representative Elissa Slotkin (D-MI-8)
Dr. Romine, just to be clear, in a certain segment of these algorithms, there is some evidence that they have higher rates of mistakes for African and Asian Americans. Is that correct?
Dr. Romine: It is correct that most of the one-to-many algorithms do exhibit those differentials. The highest-performing algorithms do not.

Mr. Wagner, walk me through the process of what it would be like to be misidentified at the border.
Mr. Wagner: You would then just show your passport, and an officer would manually review it. What we are matching people against is an electronic copy of their passport photo.

What are the results of the implementation of facial recognition technology at airports and our southern border? Tell me some statistics to demonstrate the value of these programs.
Mr. Wagner: We have run 43.7 million people through to date.
We have caught 252 imposters – people with legitimate documents that belonged to someone else. 75 of those were U.S. travel documents. The only biometric data on a U.S. passport is that person's passport photograph.

Out of the 43.7 million people who have gone through, how many negative hits were there?
Mr. Wagner: Our match rate is about 97 to 98 percent. That two to three percent means that we were unable to find the person in our preassembled gallery – we didn't match against the wrong person.

How many were false positives?
Mr. Wagner: I am not aware of any. There may be a small handful.

Representative Michael McCaul (R-TX-10)
I believe that technology is our friend in stopping terrorists and bad actors from entering our country. My understanding is that American citizens can opt out of the entry/exit program, is that correct?
Mr. Wagner: Yes, that is correct.

Can you comment on why the Biometric Identification Transnational Migration Alert Program is so valuable to the interests of the U.S. and the American people?
Mr. Wagner: It's critically important. People change their biographic details, and most of our watchlist searches are biographically based. Biometric identification adds a heightened level of security.

Representative Yvette Clarke (D-NY-9)
Right now, facial recognition regulation is still the Wild West. Meanwhile, facial recognition technologies are routinely misidentifying women and people of color. Although there are some promising applications for facial recognition, these benefits don't outweigh the risk of automating discrimination. We have seen passengers, and particularly darker-skinned passengers, unable to be matched due to poor lighting and other factors. Does CBP track how often its systems fail to capture photos of sufficient quality for matching?
Mr. Wagner: We don't own all the cameras, so it's difficult for us to track the cameras airlines are using and the quality of the photos they are taking.
We are tracking how many pictures we receive and what our match rates against them are. We have a certain standard of quality required for photos to be submitted. We do not know how many attempts at taking a good quality picture are made.

Mr. Wagner, at a future date, please provide the committee with a letter detailing the steps that CBP is taking to ensure it can capture high quality images of travelers for successful facial recognition screening.

Representative Clay Higgins (R-LA-3)
Does NIST know the identity of the vendor when you are testing their respective facial recognition algorithm?
Dr. Romine: Yes, the vendors identify themselves.

Is there a wide variance between top-performing algorithms, like the one CBP uses, and low-performing ones, say, those used by university projects?
Dr. Romine: The top-performing algorithms are significantly better. However, I have no way to independently verify that CBP is using a top-performing algorithm.

Mr. Wagner, can you confirm that CBP is using one of the top-performing algorithms?
Mr. Wagner: We're not using the algorithm they tested, but we will soon upgrade to one of the top-performing algorithms.

Representative Donald Payne (D-NJ-10)
Where is all this facial recognition data stored? Under what specific circumstances is this data allowed to be shared or transferred?
Mr. Wagner: We are using travel document databases. These are photographs collected by the U.S. government for the purpose of putting them on a travel document, such as a passport or visa. These photos constitute the baseline gallery that we match against with our facial recognition technology. New photographs we take of a person are discarded after 12 hours. If you are a foreign national, your photos are sent to a system called IDENT that DHS runs, where they are stored under the protocols of the system of records notice, with a data retention period that I believe is 75 years.
Representative Mark Walker (R-NC-6)
Is it true that a biometric entry/exit system uses less personally identifiable information than the system we currently have in place?
Mr. Wagner: Yes. Currently, you have to open your passport booklet and show it to an individual to check your bags, board the plane, get through TSA, and so forth. All your passport information, including date of birth, place of birth, and so on, is presented to an individual who does not require all of that information. You're sharing less information with facial recognition technology.

Are there any nefarious individuals, say involved in child trafficking, who have been caught because of this facial recognition technology?
Mr. Wagner: On the land border, we have caught 247 imposters; 18 of those were under the age of 18. 73 of those at the land border had U.S. passports, and 46 of them, almost 20%, had criminal records that they were trying to hide.

Were these individuals caught strictly because of the facial recognition technology?
Mr. Wagner: Our officers are also very good at identifying behaviors. The technology, on top of the officers' skills and abilities, creates maximal security.

Representative Dina Titus (D-NV-1)
How do you coordinate using this technology for security while also reducing wait times, as opposed to making passengers' experiences more difficult?
Mr. Wagner: Facial recognition aligns our national security goals with our travel and tourism goals as well. It makes for a more convenient, consistent passenger experience. We're seeing reduced wait times.