PULLING BACK THE VEIL: EXPOSING PERNICIOUS USES OF FACIAL RECOGNITION TECHNOLOGY.

Date: 01 January 2023
Author: Kim, Christopher Ian

INTRODUCTION

Facial recognition is nothing new. Technology giants have been developing and implementing facial recognition for years; our iPhone lock mechanisms are proof of that. The potential uses for facial recognition are limitless, and many companies already wield the capability to create powerful tools. Several countries, enticed by the promise of such tools, have jumped at the opportunity to use facial recognition to bolster policing and security forces. The Chinese government serves as a primary example. (1) However, even in the United States, private companies have sold novel artificial intelligence ("AI") technologies to law enforcement agencies, which have leveraged them to surveil a largely unsuspecting American public. (2)

The primary obstacle for such companies in implementing their technology is privacy. Several facial recognition-based software companies are currently under legal fire for privacy violations. (3) In many of these cases, courts have legitimized privacy concerns by refusing to grant motions to dismiss. (4) This year, Apple announced that it will begin scanning personal photo libraries on Apple devices to crack down on child abuse and pornography. (5) Both domestic and international digital rights groups have raised concerns about shrinking privacy rights and the general public's lack of control over the matter. For a company like Apple, which has distinguished itself with a platform tailored to keep personal information private, a decision to roll out an invasive security feature without an opt-out option is bold. (6) This move comes with a tradeoff. Customers must now choose between the peace of mind that their data is for their eyes only (and whoever they choose to share it with) and the very different peace of mind that minors and other vulnerable technology users are protected against bad actors. This note attempts to answer some of the questions that executives at Apple likely mulled over. What is possible with facial recognition? What are the privacy concerns of the general public? What should society's privacy concerns be? Most importantly, how much autonomy are consumers willing to sacrifice in the name of safety and security, efficiency, and a heightened e-commerce experience?

This analysis, broken down into three parts, uncovers a broad lack of public awareness about the ways both governmental and private entities are utilizing facial recognition scans and how such uses may change current conceptions of privacy. Following a brief introduction, Section II discusses various governmental uses of facial recognition technology ("FRT"). This second section primarily focuses on China, where a rapidly developing national surveillance system leveraging FRT aims to combat universal evils like crime and terrorism. Section III explores how global corporations have leveraged facial recognition and AI. Although this paper attempts to segment uses of FRT into separate public and private spheres, such distinctions are not easily drawn. When it comes to FRT, governmental entities and boundary-pushing technology companies feed off each other in a supply-and-demand market that, without third-party intervention, loses sight of pressing privacy issues. Section IV identifies key issues derived from the various case studies in Sections II and III, examines existing privacy regulation frameworks, and attempts to construct a broad, principle-guided regulatory framework specific to FRT.

Ultimately, I argue that corporations and law enforcement agencies distract consumers and citizens with convenient new products and security mechanisms that promise to ward off crime and terrorism. The problem with these distractions is not that they lack merit as pursuits, but that they often mask the true intentions and future consequences of FRT utilization, robbing the public of a meaningful voice in the privacy discussion. Building an effective regulatory framework begins with pulling back the veils of consumer benefits and heightened security. Targeted pushes for transparency and accountability ultimately require a level of public awareness that does not yet exist. Rather than proposing technical, country-specific solutions, this analysis serves as a primer on the privacy issues posed by the global advancement of FRT and the broad ways of addressing them. My hope is that this paper will spur young and seasoned academics, attorneys, scholars, and other privacy advocates to think deeply about these issues and imagine viable solutions that fit the needs and societal norms of their respective locales.

  1. GOVERNMENTAL USES OF FRT: CHINA & INDIA CASE STUDIES

    Economists have estimated that the global market for facial recognition technology will surpass seven billion dollars by the end of 2022. (7) A very large portion of that market comes from a growing list of countries interested in bolstering existing security mechanisms or building new ones altogether. China is a primary developer and supplier of facial recognition technology. In 2019, Freedom House published a report finding that eighteen countries spanning multiple continents had purchased Chinese FRT. (8) Attached to these purchases is a training program for government officials on how to "better watch their own people." (9) Although nation-state customers are free to use the technology as they see fit, China's national surveillance framework has undoubtedly provided inspiration for other nation-states. With an estimated 200 million security cameras (all of which are now equipped with FRT), China accounts for the largest share of installed security cameras globally. (10) The United States is second in terms of the overall number of cameras, but when population differences are accounted for, both countries have roughly one security camera for every four citizens. (11)

    In recent years, the Chinese government has made artificial intelligence development a focal point. The country's "National AI Team" consists of government-selected technology companies that are granted special access to public databases and repositories. (12) This public-private partnership aims to streamline innovation and harness new technologies for predetermined, government-set goals. (13)

    Each company is assigned a specific sector and is meant to work with the public office for that sector. (14) For example, iFlytek cooperates with courts to create a more efficient case-handling system. (15) Other companies work with China's education department to incorporate AI into school curricula. (16) SenseTime, a large AI company, handles the collection of data from China's CCTV surveillance cameras. (17) In 2018, the company provided Chinese police forces with high-powered glasses equipped with FRT that connects to China's state criminal database, allowing police officers to identify potential suspects in real time. (18)

    At face value, the use of FRT in policing seems laudable, (19) but human rights advocates and academics have expressed concerns about the specific ways the Chinese government has utilized the technology. In 2019, FRT was trained by government officials to identify individuals of Uighur descent, a largely Muslim minority group primarily located in the western Xinjiang region, purely based on physical attributes. (20) China sparked wide international condemnation when law enforcement officials, in a self-proclaimed fight against threats of terrorism, imprisoned hundreds of thousands of Uighur nationals and were accused of destroying mosques in an effort to suppress the religious group. (21) Allegations of forced hard labor, forced sterilizations, and overall mistreatment resulted in increased criticism of China's use of FRT in furthering what some world leaders have called a genocide. (22)

    Clare Garvie of Georgetown Law's Center on Privacy & Technology has said, "if you can make a technology that can classify people by ethnicity then people will use it to repress that ethnicity." (23) Many regulators in other developed nations share Garvie's view. Article 9 of the EU's General Data Protection Regulation (GDPR) specifically bans, absent clear and explicit consent, the collection of data that identifies subjects through biometric data or that reveals ethnicity, race, sexual orientation, political opinions, philosophical beliefs, trade union membership, or religion. (24) In the United States, the state of Illinois passed the Biometric Information Privacy Act ("BIPA") with similar goals and concerns in mind. (25) Both Texas and Washington followed suit with their own laws restricting biometric data collection. (26)

    FRT's propensity for misidentification compounds these growing concerns. Independent studies from the National Institute of Standards and Technology and the Gender Shades Project highlighted FRT's propensity to misidentify certain demographics. (27) For instance, Amazon's patented Rekognition system demonstrated a tendency to misidentify women with darker skin (a 31% error rate in biological sex classification). (28) Apple faced backlash when reports surfaced that its facial recognition screen-locking mechanism could not differentiate between eastern Asian faces as well as it could for other demographics, resulting in weaker security for those individuals. (29) Chinese CCTV footage is often affected by weather and lighting conditions. Furthermore, China's AI relies on input from police officers to "teach" the system how to categorize humans based on "social definitions of ethnicity." (30) Human input inherently introduces the possibility of bias, which is particularly problematic in policing systems accused of racial bias or prejudice.

    In 2016, Detroit, Michigan, orchestrated a large-scale surveillance program called Project Green Light (PGL). (31) High-definition cameras, equipped with FRT, were installed throughout the metropolitan area and were linked to Detroit Police Department criminal databases. PGL cameras were distributed more heavily in...
