Show simple item record

dc.contributor.advisor: MacCarthy, Mark
dc.creator:
dc.date.accessioned: 2018-06-22T13:37:25Z
dc.date.available: 2018-06-22T13:37:25Z
dc.date.created: 2018
dc.date.issued:
dc.date.submitted: 01/01/2018
dc.identifier.other: APT-BAG: georgetown.edu.10822_1050752.tar; APT-ETAG: 88b58b111790a9dc0fb4a974747c50a2; APT-DATE: 2019-03-28_10:33:03
dc.identifier.uri:
dc.description: M.A.
dc.description.abstract: This paper examines discrimination in facial recognition technology (FRT) and how to mitigate it in the contexts of academia, product development, and industrial research. FRT automates the processing of human faces. In recent years, driven by rapid advances in machine learning, FRT has gained considerable momentum. FRT systems are increasingly trained on extraordinarily large datasets with sophisticated algorithms, and their accuracy has improved to the point that it surpasses human capacity. Applications of FRT have emerged in a variety of fields, such as surveillance, the military, security, and e-commerce. At the same time, many ethical issues have been raised. In this paper, two types of FRT applications are distinguished: identification and classification. The former searches for and matches a captured face against a target database to pinpoint an identity, while the latter classifies people into groups according to properties inferred from their facial features, such as gender, race, age, and sexual orientation. The latter raises serious discrimination issues, because the training data is inherently biased and the technology can easily be used to build discriminatory applications, increasing the number of people who suffer from discrimination. In order to mitigate discrimination, three types of FRT design practices are identified: product development, academic research, and industrial research. Value Sensitive Design (VSD) is a helpful approach to minimizing discrimination in product development. In academic settings, the traditional way to ensure ethical outcomes is through institutional review boards (IRBs), but IRBs have significant shortcomings when dealing with FRT and data science in general. In industrial research, Facebook’s ethics review system, developed after the “emotion contagion” study, is discussed as a case study to derive general principles that can help private companies in the FRT field mitigate discrimination in research, such as ethics training and multidisciplinary review teams.
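The identification/classification distinction in the abstract can be made concrete with a minimal sketch. Everything below is illustrative and not from the thesis: the 128-dimensional face embeddings, the nearest-neighbor matching, and the stand-in linear classifier are all assumptions about how such systems are commonly built.

```python
import numpy as np

# Hypothetical 128-d face embeddings; in practice these would come from a
# trained face-encoding model, which the thesis does not specify.
rng = np.random.default_rng(0)
database = rng.normal(size=(1000, 128))   # enrolled identities
labels = np.arange(1000)                  # identity IDs
probe = rng.normal(size=128)              # the captured face

# Identification: search the target database for the closest match
# to pinpoint an identity.
distances = np.linalg.norm(database - probe, axis=1)
matched_identity = labels[np.argmin(distances)]

# Classification: predict a group attribute (e.g., one of four age
# brackets) from the same embedding. This is the application type the
# abstract flags as most prone to inheriting bias from training data.
weights = rng.normal(size=(128, 4))       # stand-in linear classifier
predicted_group = int(np.argmax(probe @ weights))
```

The sketch shows why the two applications differ ethically: identification compares a face only against enrolled records, while classification assigns every face a group label whose meaning is baked in by whoever chose the training data.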
dc.format: PDF
dc.format.extent: 77 leaves
dc.language: en
dc.publisher: Georgetown University
dc.source: Georgetown University-Graduate School of Arts & Sciences
dc.source: Communication, Culture & Technology
dc.subject: artificial intelligence
dc.subject: discrimination
dc.subject: ethics
dc.subject: facial recognition
dc.subject: machine learning
dc.subject: value sensitive design
dc.subject.lcsh: Information technology
dc.subject.lcsh: Ethics
dc.subject.other: Information technology
dc.subject.other: Ethics
dc.title: What's in Your Face? Discrimination in Facial Recognition Technology
dc.type: thesis

