Facial Recognition Technology and Its Application in Educational and Other Sensitive Settings
Facial recognition technology (FRT), which measures an individual’s face to automate identity verification, dates back to the 1850s, when England introduced prison photography to track escaped prisoners. Today, sophisticated forms of FRT unlock cellphones, help TSA agents screen travelers and police sports stadiums and casinos. Outside the U.S., FRT has been implemented in K-12 and university classrooms. Although U.S. schools and other sensitive settings have not yet adopted these practices to the same degree, some are pushing to expand the use of FRT, sparking debate around its costs and benefits.
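For readers unfamiliar with the mechanics, most modern FRT reduces a face image to a numeric “embedding” and verifies a claimed identity by measuring how similar two embeddings are. The following Python sketch is purely illustrative: the vectors are synthetic stand-ins for a trained model’s output, and the 0.8 threshold is an assumption, not any vendor’s actual setting.

```python
# Minimal sketch of embedding-based identity verification.
# Real systems derive embeddings from a trained neural network;
# random vectors stand in for them here.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two face embeddings (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.8) -> bool:
    """Accept the identity claim if the embeddings are similar enough.

    The threshold is a policy choice: lower it and false matches rise;
    raise it and legitimate users are rejected more often.
    """
    return cosine_similarity(probe, enrolled) >= threshold

rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)                     # stored at enrollment
probe = enrolled + rng.normal(scale=0.1, size=128)  # new photo, same person
stranger = rng.normal(size=128)                     # unrelated face

print(verify(probe, enrolled))     # True: a close match
print(verify(stranger, enrolled))  # False: unrelated vectors rarely align
```

Where that threshold is set, and on whose faces the underlying model was trained, drive the accuracy and bias questions discussed below.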
Though FRT is not widespread in U.S. classrooms, some states have implemented the technology. For instance, an Arizona K-12 district recently installed an AI-powered security system on several Navajo school campuses. Beyond security applications, schools throughout the U.S. used online proctoring services during the COVID-19 pandemic. These services, which required students to scan their rooms with a webcam, proved useful for exam administration but drew criticism for invading student privacy and for assumptions of cheating that led to unwarranted discipline.
In China, however, FRT has been more heavily used in K-12 and university classrooms. These “intelligent behavior management systems” collect, store and respond to student facial expressions during lessons. Some of this software has misdiagnosed children, mischaracterizing isolated facial expressions as evidence of social disorders or inattention. Due in part to privacy concerns, hundreds of the surveillance channels that livestreamed camera footage from Chinese classrooms were shut down in 2017, but other facial recognition systems continue to flourish in China and India.
One U.S.-based proponent of FRT is Chafic Bou-Saba, an associate professor of computing technology and information systems at Guilford College. He is developing an FRT system similar to those used in China as a potential aid for faculty members who cannot effectively monitor distracted students. Like the Chinese systems, Bou-Saba’s technology would place multiple cameras throughout a classroom and record five- to ten-second videos to track attendance and document student behavior, attention and emotional states (e.g., boredom, distraction or confusion). And BrainCo., a startup based in Somerville, Massachusetts and funded by Harvard University, has developed a potential companion technology that monitors students’ brain waves, which has been tested on students in China. Once Pandora’s box has been opened, the idea of a college admissions officer taking note of an applicant’s third-grade daydreaming episode requires little stretch of the imagination.
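Bou-Saba’s implementation is not public, but a rough Python sketch can make concrete what such a pipeline would involve and what data it would accumulate. Everything below is hypothetical: the single webcam, the five-second clip length (within the range described above) and the stubbed-out emotion classifier are all assumptions, and only OpenCV’s stock face detector is real.

```python
# Hypothetical sketch of a classroom-monitoring loop of the kind
# described above. Uses OpenCV's bundled Haar-cascade face detector;
# the emotion classifier is a placeholder, not a real model.
import time
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_pixels) -> str:
    """Placeholder: a real system would run a trained model here."""
    return "unknown"

camera = cv2.VideoCapture(0)  # one webcam; the described system uses several
CLIP_SECONDS = 5              # within the five- to ten-second range above

for clip in range(3):  # a real deployment would run all lesson long
    start = time.time()
    labels = []
    while time.time() - start < CLIP_SECONDS:
        ok, frame = camera.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            labels.append(classify_emotion(frame[y:y + h, x:x + w]))
    # Each clip yields labeled records about identifiable students;
    # a real deployment would persist these rather than print them.
    print(f"clip {clip}: {len(labels)} face detections, labels {set(labels)}")

camera.release()
```

Even this toy version shows that every few seconds of class time produces stored, labeled observations of children’s faces, which is precisely the data trail that worries the critics discussed below.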
A report by the National Academies of Sciences, Engineering, and Medicine recently cautioned that facial recognition technologies exist in a largely unregulated ecosystem and that advancements have “outpaced laws and regulations, raising significant concerns related to equity, privacy and civil liberties.” Erik Learned-Miller, chair of the faculty and professor of computer science at the University of Massachusetts at Amherst, is similarly wary of FRT in learning environments, stating that “it’s not fun to be under the microscope all the time. You have to be very careful of the stress that surveillance puts on students.” The longer-term, and perhaps greater, concern is that students may become too accepting of sweeping forms of surveillance. To combat this, students ought to be given the opportunity to opt out of FRT use and be assured that they will not be penalized for doing so. Parents, too, should remain wary of programs involving facial recognition and ensure that all involved parties know how and why their data is being collected.
Technology scholars like Kathleen Creel, assistant professor of philosophy and computer science at Northeastern University, have commented on the non-consensual aspect of facial recognition. Often, the image datasets used to train FRT, which feature the faces of real individuals, have been collected from security cameras or from the Internet without consent. More than 3,100 U.S. agencies, chief among them the FBI and the Department of Homeland Security, have used Clearview AI, a technology trained on billions of images obtained without consent. Although the tool can be more reliable than other government databases, civil rights advocates have argued that using FRT against individuals not accused of crimes puts people in a “perpetual police lineup.”
Within the university context, researchers at Duke University recorded the faces of thousands of students for their own image dataset as early as March 2014 and made the likenesses available on a public website without the students’ knowledge or consent. Among those who downloaded the likenesses were academics, security contractors and military researchers around the globe. In 2019, media reports made the dataset’s existence public. According to Michael Schoenfeld, Duke’s vice president for public affairs and government relations, the dataset had been collected and made available to the public without the approval required by the university’s Institutional Review Board (“IRB”), which reviews all studies involving human subjects at the university. Following the IRB investigation, the dataset was removed.
In an attempt to protect student privacy, Arkansas, California, Illinois, Texas and Washington have each enacted laws concerning FRT. These laws seek to regulate the “collection, use, safeguarding and retention” of biometric information, i.e., “measurable physical, physiological, or behavioral characteristics that are attributable to a person… that can be used to identify a person.” Such data can be generated by FRT, fingerprint and handprint recognition, voice recognition, iris and retina recognition, DNA sequencing and gait recognition technologies.
Although New York had originally placed a three-year ban on all biometric technology use in K-12 classrooms, schools and districts are now permitted to determine “appropriate uses.” However, New York State schools continue to be prohibited from purchasing or utilizing FRT specifically. Because each technology and application differs, schools must balance its risks and benefits, as well as its costs relative to school funding, the age and demographics of the students impacted and the goals of its use. According to New York State Education Department Commissioner Betty A. Rosa, these determinations would require weighing the “technology’s privacy implications, impact on civil rights, effectiveness, and parental input.”
Another concern highlighted by technology ethics scholars is the student-faculty power imbalance created by FRT. Researchers at the University of Michigan have released a study strongly recommending against the use of FRT in schools, finding that its use would likely have five implications: (1) exacerbating racism, (2) normalizing surveillance and eroding privacy, (3) narrowing the definition of the “acceptable student,” (4) commodifying data and (5) institutionalizing inaccuracy. These findings do not exist in a vacuum; the New York Office of Information Technology Services has reached similar conclusions. Accordingly, educators ought to consider the demographic makeup of their classrooms, among other variables. Because many facial recognition programs are tested primarily on white men, the results may be racially and gender biased. For example, Oprah Winfrey, Michelle Obama and Serena Williams have all been misgendered or misidentified by such systems. As it currently exists, therefore, facial recognition software may perpetuate significant discrimination in behavior management, academic data analysis and psychological assessment.

While FRT offers many potential benefits in security and monitoring, its expansion into educational and other sensitive settings necessitates thoughtful consideration of privacy, ethics, equity and societal norms. Lutzker & Lutzker will continue to provide updates on this important intersection of privacy and technology.