Show simple item record

dc.contributor.author	Kizilcec, René F
dc.contributor.author	Reich, Justin
dc.contributor.author	Yeomans, Michael
dc.contributor.author	Dann, Christoph
dc.contributor.author	Brunskill, Emma
dc.contributor.author	Lopez, Glenn
dc.contributor.author	Turkay, Selen
dc.contributor.author	Williams, Joseph Jay
dc.contributor.author	Tingley, Dustin
dc.date.accessioned	2021-10-27T20:22:35Z
dc.date.available	2021-10-27T20:22:35Z
dc.date.issued	2020
dc.identifier.uri	https://hdl.handle.net/1721.1/135236
dc.description.abstract	Online education is rapidly expanding in response to rising demand for higher and continuing education, but many online students struggle to achieve their educational goals. Several behavioral science interventions have shown promise in raising student persistence and completion rates in a handful of courses, but evidence of their effectiveness across diverse educational contexts is limited. In this study, we test a set of established interventions over 2.5 y, with one-quarter million students, from nearly every country, across 247 online courses offered by Harvard, the Massachusetts Institute of Technology, and Stanford. We hypothesized that the interventions would produce medium-to-large effects as in prior studies, but this is not supported by our results. Instead, using an iterative scientific process of cyclically preregistering new hypotheses in between waves of data collection, we identified individual, contextual, and temporal conditions under which the interventions benefit students. Self-regulation interventions raised student engagement in the first few weeks but not final completion rates. Value-relevance interventions raised completion rates in developing countries to close the global achievement gap, but only in courses with a global gap. We found minimal evidence that state-of-the-art machine learning methods can forecast the occurrence of a global gap or learn effective individualized intervention policies. Scaling behavioral science interventions across various online learning contexts can reduce their average effectiveness by an order of magnitude. However, iterative scientific investigations can uncover what works, where, and for whom.
dc.language.iso	en
dc.publisher	Proceedings of the National Academy of Sciences
dc.relation.isversionof	10.1073/PNAS.1921417117
dc.rights	Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
dc.source	PNAS
dc.title	Scaling up behavioral science interventions in online education
dc.type	Article
dc.contributor.department	Massachusetts Institute of Technology. Program in Comparative Media Studies/Writing
dc.relation.journal	Proceedings of the National Academy of Sciences of the United States of America
dc.eprint.version	Final published version
dc.type.uri	http://purl.org/eprint/type/JournalArticle
eprint.status	http://purl.org/eprint/status/PeerReviewed
dc.date.updated	2021-03-19T13:53:39Z
dspace.orderedauthors	Kizilcec, RF; Reich, J; Yeomans, M; Dann, C; Brunskill, E; Lopez, G; Turkay, S; Williams, JJ; Tingley, D
dspace.date.submission	2021-03-19T13:53:40Z
mit.journal.volume	117
mit.journal.issue	26
mit.license	PUBLISHER_POLICY
mit.metadata.status	Authority Work and Publication Information Needed

