The fallacy of equating "blindness" with fairness: ensuring trust in machine learning applications to consumer credit
Author(s): Abuhamad, Grace M. (Grace Marie)
Massachusetts Institute of Technology. Institute for Data, Systems, and Society.
Technology and Policy Program.
Advisor: Daniel J. Weitzner
Fifty years ago, the United States Congress coalesced around a vision for fair consumer credit: equally accessible to all consumers, developed on accurate and relevant information, and subject to controls for consumer privacy. In two foundational pieces of legislation, the Fair Credit Reporting Act (FCRA) and the Equal Credit Opportunity Act (ECOA), legislators described mechanisms by which these goals would be met, most notably prohibiting certain information, such as a consumer's race, as the basis for credit decisions, under the assumption that being "blind" to this information would prevent wrongful discrimination. While the policy goals for fair credit remain valid today, the mechanisms designed to achieve them are no longer effective.

The consumer credit industry is increasingly interested in using new data and machine learning modeling techniques to determine consumer creditworthiness, and with these technological advances come new risks not mitigated by existing mechanisms. This thesis evaluates how these "alternative" credit processes challenge the mechanisms established in the FCRA and the ECOA and their vision for fairness. "Alternative" data and models facilitate inference or prediction of consumer information, which makes them non-compliant with these statutes. In particular, this thesis investigates the idea that "blindness" to certain attributes hinders consumer fairness more than it helps, since it limits the ability to determine whether wrongful discrimination has occurred and to build better-performing models for populations that have been historically underserved.

This thesis concludes with four recommendations to modernize fairness mechanisms and ensure trust in the consumer credit system: 1) expanding the definition of a consumer report under the FCRA; 2) encouraging model explanations and transparency; 3) requiring self-testing using prohibited information; and 4) permitting the use of prohibited information to allow for more comprehensive models.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Thesis: S.M. in Technology and Policy, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society, 2019.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 75-82).
Department: Massachusetts Institute of Technology. Institute for Data, Systems, and Society; Massachusetts Institute of Technology. Engineering Systems Division; Technology and Policy Program