
dc.contributor.author: Garcia, Ana C. B.
dc.contributor.author: Garcia, Marcio G. P.
dc.contributor.author: Rigobon, Roberto
dc.date.accessioned: 2023-05-22T13:47:40Z
dc.date.available: 2023-05-22T13:47:40Z
dc.date.issued: 2023-05-17
dc.identifier.uri: https://hdl.handle.net/1721.1/150785
dc.description.abstract: The widespread use of machine learning systems and econometric methods in the credit domain has transformed the decision-making process for evaluating loan applications. Automated analysis of credit applications reduces the subjectivity of the decision-making process. On the other hand, because machine learning learns from past decisions recorded in financial institutions' datasets, the process very often consolidates existing bias and prejudice against groups defined by race, sex, sexual orientation, and other attributes. Interest in identifying, preventing, and mitigating algorithmic discrimination has therefore grown rapidly in many areas, such as Computer Science, Economics, Law, and Social Science. We conducted a comprehensive systematic literature review to understand (1) the research settings, including the discrimination theory foundation, the legal framework, and the applicable fairness metrics; (2) the addressed issues and solutions; and (3) the open challenges for potential future research. We explored five sources: ACM Digital Library, Google Scholar, IEEE Digital Library, Springer Link, and Scopus. Following inclusion and exclusion criteria, we selected 78 papers written in English and published between 2017 and 2022. According to the meta-analysis in this literature survey, algorithmic discrimination has been addressed mainly from the Computer Science, Law, and Economics perspectives. There has been great interest in this topic in the financial area, especially discrimination in access to the mortgage market and differential treatment (different fees, numbers of installments, and interest rates). Most attention has been devoted to potential discrimination due to bias in the dataset. Researchers are still dealing mainly with direct discrimination, addressed through algorithmic fairness, while indirect (structural) discrimination has not received the same attention. [en_US]
dc.publisher: Springer London [en_US]
dc.relation.isversionof: https://doi.org/10.1007/s00146-023-01676-3 [en_US]
dc.rights: Creative Commons Attribution [en_US]
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/ [en_US]
dc.source: Springer London [en_US]
dc.title: Algorithmic discrimination in the credit domain: what do we know about it? [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Garcia, Ana C. B., Garcia, Marcio G. P., and Rigobon, Roberto. 2023. "Algorithmic discrimination in the credit domain: what do we know about it?"
dc.contributor.department: Sloan School of Management
dc.identifier.mitlicense: PUBLISHER_CC
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/JournalArticle [en_US]
eprint.status: http://purl.org/eprint/status/PeerReviewed [en_US]
dc.date.updated: 2023-05-21T03:11:51Z
dc.language.rfc3066: en
dc.rights.holder: The Author(s)
dspace.embargo.terms: N
dspace.date.submission: 2023-05-21T03:11:51Z
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed [en_US]
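The abstract above refers to fairness metrics used to assess direct discrimination in credit decisions. As a hedged illustration (not taken from the paper itself), the following minimal Python sketch computes two widely used group-fairness measures, the demographic parity difference and the disparate impact ratio, on a hypothetical loan-approval dataset; the variable names and numbers are invented for the example.

```python
import numpy as np

def demographic_parity_difference(approved, protected):
    """Difference in approval rates between the protected and reference groups (0 = parity).

    approved:  binary array of decisions (1 = loan approved)
    protected: binary array of protected-attribute membership (1 = protected group)
    """
    approved = np.asarray(approved, dtype=float)
    protected = np.asarray(protected, dtype=bool)
    rate_protected = approved[protected].mean()    # approval rate in the protected group
    rate_reference = approved[~protected].mean()   # approval rate in the reference group
    return rate_protected - rate_reference

def disparate_impact_ratio(approved, protected):
    """Ratio of approval rates; values below 0.8 fail the common 'four-fifths rule'."""
    approved = np.asarray(approved, dtype=float)
    protected = np.asarray(protected, dtype=bool)
    return approved[protected].mean() / approved[~protected].mean()

# Hypothetical example: 10 loan decisions and a protected-group indicator.
decisions = [1, 0, 1, 1, 0, 1, 1, 0, 0, 1]
protected = [1, 1, 0, 0, 1, 0, 0, 1, 1, 0]

print(demographic_parity_difference(decisions, protected))  # -0.8
print(disparate_impact_ratio(decisions, protected))          # 0.2
```

In this sketch only 20% of protected-group applicants are approved versus 100% of the reference group, so both metrics flag a large disparity; real analyses in the surveyed literature typically compute such statistics on model predictions rather than recorded outcomes.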

