Reviews and recent articles on facial recognition: the technology and the privacy concerns.
**updated January 2022**
Classic reviews:
* Bowyer, K. W. (2004). Face recognition technology: security versus privacy. IEEE Technology and Society Magazine, 23(1), 9-19. [PDF] [Cited by]
“Video surveillance and face recognition systems have become the subject of increased interest and controversy after the September 11 terrorist attacks on the United States. In favor of face recognition technology, there is the lure of a powerful tool to aid national security. On the negative side, there are fears of an Orwellian invasion of privacy. Given the ongoing nature of the controversy, and the fact that face recognition systems represent leading edge and rapidly changing technology, face recognition technology is currently a major issue in the area of social impact of technology. We analyze the interplay of technical and social issues involved in the widespread application of video surveillance for person identification. ”
* Froomkin, A. M. (2000). The death of privacy? Stanford Law Review, 52(5), 1461. [PDF] [Cited by]
“The rapid deployment of privacy-destroying technologies by governments and businesses threatens to make informational privacy obsolete. The first part of this article describes a range of current technologies to which the law has yet to respond effectively. These include: routine collection of transactional data, growing automated surveillance in public places, deployment of facial recognition technology and other biometrics, cell-phone tracking, vehicle tracking, satellite monitoring, workplace surveillance, internet tracking from cookies to “clicktrails,” hardware-based identifiers, intellectual property protecting “snitchware,” and sense-enhanced searches that allow observers to see through everything from walls to clothes. The cumulative and reinforcing effect of these technologies may make modern life completely visible and permeable to observers; there could be nowhere to hide. The second part of the article discusses leading attempts to craft legal responses to the assault on privacy–including self-regulation, privacy-enhancing technologies, data-protection law, and property-rights based solutions–in the context of three structural obstacles to privacy enhancement: consumers’ privacy myopia; important First Amendment protections of rights to collect and repeat information; and fear of what other people may do if not monitored.”
Other reviews/articles:
* Andrejevic, M., & Selwyn, N. (2020). Facial recognition technology in schools: Critical questions and concerns. Learning, Media and Technology, 45(2), 115-128. [PDF] [Cited by]
“Facial recognition technology is now being introduced across various aspects of public life. This includes the burgeoning integration of facial recognition and facial detection into compulsory schooling to address issues such as campus security, automated registration and student emotion detection. So far, these technologies have largely been seen as routine additions to school systems with already extensive cultures of monitoring and surveillance. While critical commentators are beginning to question the pedagogical limitations of facially driven learning, this article contends that school-based facial recognition presents a number of other social challenges and concerns that merit specific attention. This includes the likelihood of facial recognition technology altering the nature of schools and schooling along divisive, authoritarian and oppressive lines. Against this background, the article considers whether or not a valid case can ever be made for allowing this form of technology in schools.”
* Becerra-Riera, F., Morales-González, A., & Méndez-Vázquez, H. (2019). A survey on facial soft biometrics for video surveillance and forensic applications. Artificial Intelligence Review, 52(2), 1155-1187. [Cited by]
“The face is one of the most reliable and easy-to-acquire biometric features, widely used for the recognition of individuals. In controlled environments, facial recognition systems are highly effective, however, in real world scenarios and under varying lighting conditions, pose changes, facial expressions, occlusions and low resolution of captured images/videos, the task of recognizing faces becomes significantly complex. In this context, it has been shown that certain attributes can be retrieved with a relative probability of success, being useful to complement a non-conclusive result of a biometric system. In this paper, we present an overview on face describable visual attributes and in particular of the so-called soft biometrics (e.g., facial marks, gender, age, skin color, and other physical characteristics). We review core issues regarding this topic, for instance what are the soft biometrics, which of them are the most robust in video surveillance and other uncontrolled scenarios, how different approaches have been addressed in the literature for their representation and classification, which datasets can be used for evaluation, which related problems remain unresolved and which are the possible ways to approach them.”
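The survey's point that soft biometrics can complement an inconclusive face match can be illustrated with a small score-fusion sketch. Everything below (the attribute set, the weighting, the candidate scores) is an illustrative assumption for demonstration, not taken from Becerra-Riera et al.

```python
# Illustrative sketch only: fusing a primary face-match score with soft-biometric
# attributes (gender, age band, glasses) to re-rank candidates when the face match
# alone is inconclusive. Attribute names and weights are assumptions, not from the paper.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    face_score: float      # similarity from the primary face matcher, in [0, 1]
    soft_biometrics: dict  # e.g. {"gender": "F", "age_band": "30-40", "glasses": True}

def soft_match(probe: dict, candidate: dict) -> float:
    """Fraction of shared soft-biometric attributes that agree between probe and candidate."""
    shared = probe.keys() & candidate.keys()
    if not shared:
        return 0.0
    return sum(probe[k] == candidate[k] for k in shared) / len(shared)

def fused_score(probe_soft: dict, cand: Candidate, w_face: float = 0.7) -> float:
    """Weighted fusion of the face-match score and the soft-biometric agreement."""
    return w_face * cand.face_score + (1 - w_face) * soft_match(probe_soft, cand.soft_biometrics)

if __name__ == "__main__":
    probe = {"gender": "F", "age_band": "30-40", "glasses": True}
    gallery = [
        Candidate("A", 0.62, {"gender": "M", "age_band": "30-40", "glasses": False}),
        Candidate("B", 0.58, {"gender": "F", "age_band": "30-40", "glasses": True}),
    ]
    # With face scores this close, the soft-biometric agreement tips the ranking toward B.
    for c in sorted(gallery, key=lambda c: fused_score(probe, c), reverse=True):
        print(c.name, round(fused_score(probe, c), 3))
```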
* Crawford, K. (2019). Halt the use of facial-recognition technology until it is regulated. Nature, 572(7771), 565. [Cited by]
“Until appropriate safeguards are in place, we need a moratorium on biometric technology that identifies individuals. There is little evidence that biometric technology can identify suspects quickly or in real time. No peer-reviewed studies have shown convincing data that the technology has sufficient accuracy to meet the US constitutional standards of due process, probable cause and equal protection that are required for searches and arrests. ”
* Kaur, P., Krishan, K., Sharma, S. K., & Kanchan, T. (2020). Facial-recognition algorithms: A literature review. Medicine, Science, and the Law, 25802419893168. [Cited by]
“The face is an important part of the human body, distinguishing individuals in large groups of people. Thus, because of its universality and uniqueness, it has become the most widely used and accepted biometric method. The domain of face recognition has gained the attention of many scientists, and hence it has become a standard benchmark in the area of human recognition. It has turned out to be the most deeply studied area in computer vision for more than four decades. It has a wide array of applications, including security monitoring, automated surveillance systems, victim and missing-person identification and so on. This review presents the broad range of methods used for face recognition and attempts to discuss their advantages and disadvantages. Initially, we present the basics of face-recognition technology, its standard workflow, background and problems, and the potential applications. Then, face-recognition methods with their advantages and limitations are discussed. The concluding section presents the possibilities and future implications for further advancing the field.”
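As a rough illustration of the standard workflow the review outlines (detect, align, embed, compare), here is a minimal Python sketch. The `detect_and_align` and `embed` functions are hypothetical placeholders standing in for a real face detector and a trained embedding network; the threshold is arbitrary.

```python
# Minimal sketch of the standard face-recognition workflow: detect -> align -> embed -> compare.
# detect_and_align and embed are placeholders, not a specific system's API.
import numpy as np

def detect_and_align(image: np.ndarray) -> np.ndarray:
    """Placeholder: a real system would locate the face and align it via landmarks."""
    return image

def embed(face: np.ndarray) -> np.ndarray:
    """Placeholder: a real system maps an aligned face to a descriptor with a deep network;
    here we simply flatten and L2-normalise the pixels."""
    v = face.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-12)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / ((np.linalg.norm(a) * np.linalg.norm(b)) + 1e-12))

def same_person(img1: np.ndarray, img2: np.ndarray, threshold: float = 0.8) -> bool:
    """Verification: two images match if their descriptors are similar enough."""
    e1 = embed(detect_and_align(img1))
    e2 = embed(detect_and_align(img2))
    return cosine_similarity(e1, e2) >= threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    probe = rng.random((32, 32))
    print(same_person(probe, probe + 0.01 * rng.random((32, 32))))  # near-duplicate image -> True
```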
* Kosinski, M. (2021). Facial recognition technology can expose political orientation from naturalistic facial images. Scientific Reports, 11, 100. [PDF] [Cited by] **new**
“Ubiquitous facial recognition technology can expose individuals’ political orientation, as faces of liberals and conservatives consistently differ. A facial recognition algorithm was applied to naturalistic images of 1,085,795 individuals to predict their political orientation by comparing their similarity to faces of liberal and conservative others. Political orientation was correctly classified in 72% of liberal–conservative face pairs, remarkably better than chance (50%), human accuracy (55%), or one afforded by a 100-item personality questionnaire (66%). Accuracy was similar across countries (the U.S., Canada, and the UK), environments (Facebook and dating websites), and when comparing faces across samples. Accuracy remained high (69%) even when controlling for age, gender, and ethnicity. Given the widespread use of facial recognition, our findings have critical implications for the protection of privacy and civil liberties.”
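The classification step the abstract describes, comparing a face descriptor's similarity to the faces of liberal and conservative others, amounts to something like a nearest-centroid comparison on embeddings. The sketch below shows that idea on synthetic vectors; it is not Kosinski's code, and the accuracy it prints reflects only the synthetic data.

```python
# Simplified illustration: score a face descriptor by its similarity to the average
# descriptors of two labelled reference groups, then evaluate pairwise as in the paper.
# Synthetic vectors stand in for real face embeddings.
import numpy as np

rng = np.random.default_rng(42)
DIM = 128

# Synthetic stand-ins for descriptors of two labelled reference groups.
group_a = rng.normal(0.0, 1.0, size=(500, DIM)) + 0.2   # e.g. "liberal" reference faces
group_b = rng.normal(0.0, 1.0, size=(500, DIM)) - 0.2   # e.g. "conservative" reference faces

def centroid(x: np.ndarray) -> np.ndarray:
    m = x.mean(axis=0)
    return m / np.linalg.norm(m)

def score(descriptor: np.ndarray, c_a: np.ndarray, c_b: np.ndarray) -> float:
    """Positive when the descriptor is more similar to group A's centroid than to B's."""
    d = descriptor / np.linalg.norm(descriptor)
    return float(d @ c_a - d @ c_b)

c_a, c_b = centroid(group_a), centroid(group_b)

# Pairwise evaluation: for each A/B pair, check whether the A member gets the higher score.
test_a = rng.normal(0.0, 1.0, size=(1000, DIM)) + 0.2
test_b = rng.normal(0.0, 1.0, size=(1000, DIM)) - 0.2
correct = sum(score(a, c_a, c_b) > score(b, c_a, c_b) for a, b in zip(test_a, test_b))
print(f"pairwise accuracy on synthetic data: {correct / len(test_a):.2f}")
```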
* Martinez-Martin, N. (2019). What are important ethical implications of using facial recognition technology in health care? AMA Journal of Ethics, 21(2), E180-E187. [PDF] [Cited by]
“Applications of facial recognition technology (FRT) in health care settings have been developed to identify and monitor patients as well as to diagnose genetic, medical, and behavioral conditions. The use of FRT in health care suggests the importance of informed consent, data input and analysis quality, effective communication about incidental findings, and potential influence on patient-clinician relationships. Privacy and data protection are thought to present challenges for the use of FRT for health applications.”
* Wang, Y., & Kosinski, M. (2018). Deep neural networks are more accurate than humans at detecting sexual orientation from facial images. Journal of Personality and Social Psychology, 114(2), 246-257. [Cited by]
“We show that faces contain much more information about sexual orientation than can be perceived or interpreted by the human brain. We used deep neural networks to extract features from 35,326 facial images. These features were entered into a logistic regression aimed at classifying sexual orientation. Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 71% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person. Facial features employed by the classifier included both fixed (e.g., nose shape) and transient facial features (e.g., grooming style). Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles. Prediction models aimed at gender alone allowed for detecting gay males with 57% accuracy and gay females with 58% accuracy. Those findings advance our understanding of the origins of sexual orientation and the limits of human perception. Additionally, given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women.”
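The abstract describes a two-stage pipeline: a deep network extracts facial features, then a logistic regression classifies them. Below is a minimal sketch of that second stage only, with synthetic vectors standing in for the deep features; the dimensions, sample sizes, and resulting accuracy are illustrative assumptions, not the authors' data.

```python
# Generic sketch of the second stage the abstract describes: logistic regression over
# precomputed deep-network features. Synthetic features are used in place of real
# face embeddings.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for DNN face descriptors (e.g. 512-dimensional embeddings)
# with a binary label to predict.
X, y = make_classification(n_samples=2000, n_features=512, n_informative=32,
                           random_state=0)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy on synthetic features: {scores.mean():.2f}")
```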
For additional sources on facial recognition technology and privacy concerns, please see Science Primary Literature (database).
Questions? Please let me know (engelk@grinnell.edu).