Towards usable machine learning
Author(s)
Zytek, Alexandra (Alexandra Katrima)
Other Contributors
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science.
Advisor
Kalyan Veeramachaneni.
Abstract
Machine learning (ML) is being applied to a diverse and ever-growing set of domains. In many cases, domain experts -- who often have no expertise in ML or data science -- are asked to use ML predictions to make high-stakes decisions. Multiple ML usability challenges can appear as a result, such as a lack of user trust in the model, an inability to reconcile human-ML disagreement, and ethical concerns about oversimplifying complex problems to a single algorithm output. In this thesis, we investigate the ML usability challenges present in non-technical, high-stakes domains through a case study in the domain of child welfare screening. This study was conducted through a series of collaborations with child welfare screeners, including field observations, interviews, and a formal user study. Through these collaborations, we identified four key ML usability challenges and homed in on one promising ML augmentation tool to address them (local factor contributions). This thesis also includes a list of design considerations to be taken into account when developing future augmentation tools for child welfare screeners and similar domain experts. Finally, we address the remaining challenges facing the ML community in making ML models more usable across diverse domains.
Description
Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, February 2021. Cataloged from the official PDF version of the thesis. Includes bibliographical references (pages 71-74).
Date issued
2021
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology
Keywords
Electrical Engineering and Computer Science.