Show simple item record

dc.contributor.advisor	Daniel J. Weitzner.	en_US
dc.contributor.author	Abuhamad, Grace M. (Grace Marie)	en_US
dc.contributor.other	Massachusetts Institute of Technology. Institute for Data, Systems, and Society.	en_US
dc.contributor.other	Technology and Policy Program.	en_US
dc.date.accessioned	2019-09-16T18:17:15Z
dc.date.available	2019-09-16T18:17:15Z
dc.date.copyright	2019	en_US
dc.date.issued	2019	en_US
dc.identifier.uri	https://hdl.handle.net/1721.1/122094
dc.description	This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.	en_US
dc.description	Thesis: S.M. in Technology and Policy, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society, 2019	en_US
dc.description	Cataloged from student-submitted PDF version of thesis.	en_US
dc.description	Includes bibliographical references (pages 75-82).	en_US
dc.description.abstract	Fifty years ago, the United States Congress coalesced around a vision for fair consumer credit: equally accessible by all consumers, and developed on accurate and relevant information, with controls for consumer privacy. In two foundational pieces of legislation, the Fair Credit Reporting Act (FCRA) and the Equal Credit Opportunity Act (ECOA), legislators described mechanisms by which these goals would be met, including, most notably, prohibiting certain information, such as a consumer's race, as the basis for credit decisions, under the assumption that being "blind" to this information would prevent wrongful discrimination. While the policy goals for fair credit are still valid today, the mechanisms designed to achieve them are no longer effective.	en_US
dc.description.abstract	The consumer credit industry is increasingly interested in using new data and machine learning modeling techniques to determine consumer creditworthiness, and with these technological advances come new risks not mitigated by existing mechanisms. This thesis evaluates how these "alternative" credit processes pose challenges to the mechanisms established in the FCRA and the ECOA and their vision for fairness. "Alternative" data and models facilitate inference or prediction of consumer information, which makes them non-compliant. In particular, this thesis investigates the idea that "blindness" to certain attributes hinders consumer fairness more than it helps, since it limits the ability to determine whether wrongful discrimination has occurred and to build better-performing models for populations that have been historically underserved.	en_US
dc.description.abstract	This thesis concludes with four recommendations to modernize fairness mechanisms and ensure trust in the consumer credit system by: 1) expanding the definition of consumer report under the FCRA; 2) encouraging model explanations and transparency; 3) requiring self-testing using prohibited information; and 4) permitting the use of prohibited information to allow for more comprehensive models.	en_US
dc.description.sponsorship	This work was partially supported by the MIT-IBM Watson AI Lab and the Hewlett Foundation through the MIT Internet Policy Research Initiative (IPRI).	en_US
dc.description.statementofresponsibility	by Grace M. Abuhamad.	en_US
dc.format.extent	82 pages	en_US
dc.language.iso	eng	en_US
dc.publisher	Massachusetts Institute of Technology	en_US
dc.rights	MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source but further reproduction or distribution in any format is prohibited without written permission.	en_US
dc.rights.uri	http://dspace.mit.edu/handle/1721.1/7582	en_US
dc.subject	Institute for Data, Systems, and Society.	en_US
dc.subject	Technology and Policy Program.	en_US
dc.title	The fallacy of equating "blindness" with fairness : ensuring trust in machine learning applications to consumer credit	en_US
dc.type	Thesis	en_US
dc.description.degree	S.M. in Technology and Policy	en_US
dc.contributor.department	Massachusetts Institute of Technology. Institute for Data, Systems, and Society	en_US
dc.contributor.department	Massachusetts Institute of Technology. Engineering Systems Division
dc.contributor.department	Technology and Policy Program	en_US
dc.identifier.oclc	1117710058	en_US
dc.description.collection	S.M. in Technology and Policy, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society	en_US
dspace.imported	2019-09-16T18:17:11Z	en_US
mit.thesis.degree	Master	en_US
mit.thesis.department	ESD	en_US
mit.thesis.department	IDSS	en_US


