The National Academies of Sciences, Engineering, and Medicine published a 105-page report Jan. 17 that concluded the risks of facial recognition technology warrant regulatory action by the government and continued testing.
The committee responsible for the report was co-chaired by University of Wisconsin Chancellor Jennifer Mnookin and Edward W. Felten, the Robert E. Kahn Professor of Computer Science and Public Affairs at Princeton University.
The report, sponsored by the Department of Homeland Security and the Federal Bureau of Investigation, provides a detailed examination of FRT in the U.S., including its mechanisms, applications and flaws, and offers several recommendations to the federal government on how best to regulate the technology.
Mnookin’s expertise in the field of evidence law and research in biometrics and forensic sciences undergirded her position as co-chair, according to UW News. Some of Mnookin’s previous research includes numerous studies on fingerprint technology, the use of visual evidence in the American courtroom, and extensive research in the field of forensic science.
FRT offers powerful capabilities to law enforcement and other government agencies, according to the report. The tool can be instrumental in developing leads in criminal cases and helping to solve security concerns in large, crowded venues like music concerts.
“At the same time, FRT raises significant equity, privacy, and civil liberties concerns that merit attention by organizations that develop, deploy, and evaluate FRT — as well as government agencies, legislatures, state and federal courts, and civil society organizations,” the report said.
FRT works by comparing an image of a face against known faces in a database and generating matches based on how similar the images are. With the leading 2023 face recognition algorithm, 99.9% of searches against a database of 12 million mugshots return the correct matching entry.
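The matching process described above can be sketched as a similarity search over numeric face "embeddings." The function names, toy four-dimensional vectors and threshold below are illustrative assumptions, not any specific vendor's algorithm; real systems derive embeddings with hundreds of dimensions from a neural network.

```python
import numpy as np

def cosine_similarity(a, b):
    # Similarity of two embedding vectors, ranging from -1 to 1.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(probe, database, threshold=0.6):
    # Rank database entries by similarity to the probe face and keep
    # only those above an (assumed) decision threshold.
    scores = [(name, cosine_similarity(probe, emb))
              for name, emb in database.items()]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return [(name, score) for name, score in scores if score >= threshold]

# Toy database of made-up embeddings for illustration only.
db = {
    "entry_a": np.array([0.9, 0.1, 0.0, 0.2]),
    "entry_b": np.array([0.1, 0.8, 0.3, 0.0]),
}
probe = np.array([0.88, 0.12, 0.05, 0.18])
matches = search(probe, db)  # entry_a ranks first; entry_b falls below threshold
```

In this sketch, the threshold is what separates "match" from "no match," which is also where the report's accuracy concerns enter: a low-quality probe image pushes true matches below the cutoff and can pull wrong entries above it.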
But the report notes that accuracy begins to break down when the image is noisy or unclear, such as a security camera shot of a running suspect.
This flaw is compounded by the fact that even the most current FRT software in the U.S. suffers from unequal match accuracies between different racial and ethnic groups. For example, FRT has been linked to six wrongful arrests, all involving Black individuals.
“These performance differentials have not been entirely eliminated, even in the most accurate existing algorithms,” the report said. “FRT still performs less well for individuals with certain phenotypes, including those typically distinguished on the basis of race, ethnicity, and gender.”
Another issue surrounding the use of FRT is public perception of the technology. For those researching and studying the topic, the shortcomings of FRT are apparent, but for the majority of people who interface with FRT for its convenient applications, the dangers are not so obvious.
UW professor of physics Kyle Cranmer said that while technologies like FRT may seem harmless on a small scale, they can pose challenges to civil liberties when deployed to a larger audience.
“One of the challenging things here is the gap between the people’s perceptions — which are driven by everyday use of these technologies — and the societal settings where large-scale application of these technologies can be problematic,” Cranmer said. “It is even harder because in absolute terms, the technologies are often very accurate. It’s not a big inconvenience if you have to try again to open your phone or board a plane. But even small error rates can turn into big problems for civil liberties when you deploy these systems at scale.”
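Cranmer's point about small error rates becoming big problems at scale can be illustrated with a back-of-the-envelope calculation. The per-search error rate and crowd size below are assumed figures for illustration, not numbers from the report.

```python
# Assumed values for illustration only: a 0.1% chance of a wrong match
# per face scanned, applied across a large venue's crowd.
error_rate = 0.001
crowd_size = 50_000

# Expected number of false matches when every face in the crowd is scanned.
expected_false_matches = error_rate * crowd_size
print(expected_false_matches)  # 50.0
```

An error rate that is a non-event for one person unlocking a phone produces, under these assumptions, dozens of people wrongly flagged in a single crowd scan.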
UW professor of information security Dorothea Salo said these concerns have implications for both groups and individuals in the U.S. Salo said FRT could potentially alter the way citizens engage with the government on a larger scale.
“In a practical way, some of our constitutionally guaranteed rights, such as the right to freedom of assembly, are dependent on being more or less anonymous,” Salo said. “Like being part of a crowd and not being able to be picked out of the crowd. That is what facial recognition in public imperils.”
The threat to public assembly and individual liberties has inspired the creation of some city ordinances banning FRT. The Madison City Council voted in December 2020 to ban the use of FRT in all internal city agencies, with certain exceptions for the Madison Police Department, according to Madison General Ordinance 23.64. The move added Madison to a roster of large U.S. cities that have banned the technology, including Boston and San Francisco, according to The Cap Times.
For Salo, the threat this flaw poses to the civil rights of individuals in the U.S. should serve as a warning about the impacts of the technology's potential implementation.
“I don’t understand why we’re trusting a demonstratively racist technology,” Salo said. “That seems pretty much the call to halt things right there.”