August 31, 2020 - The U.S. Congress continues to examine facial recognition technology (“FRT”) with a view towards passing legislation.  On August 11, 2020, the Government Accountability Office (“GAO”) publicly released a report submitted to Congress on July 13 entitled “Facial Recognition Technology:  Privacy and Accuracy Issues Related to Commercial Uses.”  The GAO report provides important insights into the possible direction of future Congressional action.

Expanding Use of FRT in the Commercial Sector

The GAO report confirms that, as we have reported, more and more businesses are using FRT.  According to the report, revenue from the global FRT market is expected to double from $3-$5 billion in 2016-2019 to $7-$10 billion in 2022-2024.  Other markers also point to rapid growth in FRT usage and development.  From 2015 to 2019, the annual number of new FRT patents issued in the U.S. increased from 631 to 1,497, and from 2013 to 2019, the number of developers submitting FRT algorithms to the National Institute of Standards and Technology (“NIST”) for testing each year increased from 16 to 99.

Various factors are spurring this growth.  First, the technology is improving.  The use of new machine learning algorithms known as deep neural networks has led to greater speed and accuracy.  Second, costs are decreasing.  Among other things, FRT systems are moving to the cloud, which generally reduces costs for end users.  Third, consumers are becoming more comfortable with FRT, particularly in smartphones and other everyday devices.  Consumers still have privacy concerns, which may be holding back even greater market growth, but regulation and the passage of time are expected to further ameliorate these concerns.

Privacy Concerns

Any new federal legislation will have to address privacy concerns.  The GAO observes that most FRT systems rely on large data sets of facial images, which are often collected without the individuals’ consent.  One particular method, called web scraping, uses software to automatically collect facial images and associated personal information from social media, networking and other internet sites.  FRT startup Clearview AI, Inc. has allegedly assembled a data set of 3 billion facial images scraped from millions of websites without the consent of either the individuals or the websites, and is facing lawsuits in multiple states for allegedly violating state privacy laws.

Another issue is the onward sale of these assembled data sets.  Companies called “data brokers” or “information resellers” collect and sell personal information to third parties.  There are also “data consultants” that assist client companies with identifying data needs, analyzing existing data and collecting new data.  These sales usually occur behind the scenes, without the consent or knowledge of the individual data subjects.

Performance Differences across Demographic Groups

The GAO report addresses the controversial issue of alleged discrimination, noting that NIST and others have found that many FRT systems perform differently across certain demographic groups.  This appears to be more a question of accuracy than of actual bias or discriminatory intent, as evidenced by NIST’s recent findings of significant improvements in accuracy.  Those improvements are expected to continue, and some algorithms already achieve accurate performance across all demographic groups.

That said, many algorithms still show performance differences.  In 2019, NIST tested 189 mostly commercial algorithms from 99 developers.  In general, white males had the lowest false positive rate, while black females had the highest.  In some algorithms, the discrepancy was significant; in the worst-performing algorithm, the false positive rate for black females was 100 times higher than for white males.  While the consequences of such a discrepancy may not be as dire in the commercial context as they are in the law enforcement context, there can still be problems.  For example, a misidentified person could be denied access to a building or an online account, or mistakenly flagged as a shoplifter.

The GAO report also outlines potential measures that can be taken to mitigate performance differences.  These include (1) using larger and more representative training and testing data sets; (2) improving image quality by exercising better control over lighting and camera settings; (3) focusing development on achieving equal error rates across demographic groups; (4) collecting performance feedback to improve algorithms and data sets; and (5) engaging in periodic testing by independent evaluators, which could be made mandatory.

Current Privacy Protections

Currently, the U.S. does not have a comprehensive federal privacy law governing the collection, use or sale of personal information by private-sector companies.  In addition, no federal law expressly regulates the commercial use of FRT, including its use to identify and track individuals.  Federal laws addressing privacy issues, some of which potentially apply to FRT, are generally tailored to specific purposes, situations, or sectors.  These laws include the Driver’s Privacy Protection Act, the Health Insurance Portability and Accountability Act and the Children’s Online Privacy Protection Act.

As we have previously reported, a number of states regulate FRT, directly or indirectly, and these state statutes may inform federal lawmaking efforts.  Examples of relevant laws include the Illinois Biometric Information Privacy Act, the Texas Statute on the Capture or Use of Biometric Identifiers and the California Consumer Privacy Act.  Also, various states have specifically included biometric data in their data breach notification laws.  (See our prior alert on this topic here.)

Conclusion

The GAO report reiterates the recommendation from a 2013 report that Congress strengthen the current consumer privacy framework to reflect the effects of changes in technology and the marketplace.  Companies developing and using FRT need to keep a watchful eye on these legislative efforts (as well as new state laws) and plan accordingly.  It seems all but certain that the commercial use of FRT will continue to grow as the technology improves, costs decrease, and consumers become more comfortable with it.  It is just as likely that new privacy legislation will come to govern FRT development and use.

For further information, please contact: 

Seth D. Rothman | Partner 
Hughes Hubbard & Reed LLP
One Battery Park Plaza | New York, NY 10004-1482
Office +1 (212) 837-6872 | Cell +1 (917) 697-8093
seth.rothman@hugheshubbard.com | bio

Paul Marston | Counsel
Hughes Hubbard & Reed LLP
Kojimachi Place, 9th Floor, 2-3 Kojimachi | Chiyoda-ku, Tokyo 102-0083, Japan
Office +81-3-6272-5831 | Cell +81-80-8432-3497
paul.marston@hugheshubbard.com | bio