Show simple item record

dc.contributor.author: Karrer, Brian
dc.contributor.author: Ugander, Johan
dc.contributor.author: Eckles, Dean Griffin
dc.date.accessioned: 2019-03-05T17:35:17Z
dc.date.available: 2019-03-05T17:35:17Z
dc.date.issued: 2019-03-05
dc.identifier.issn: 2193-3685
dc.identifier.uri: http://hdl.handle.net/1721.1/120732
dc.description.abstract: Estimating the effects of interventions in networks is complicated due to interference, such that the outcomes for one experimental unit may depend on the treatment assignments of other units. Familiar statistical formalism, experimental designs, and analysis methods assume the absence of this interference, and result in biased estimates of causal effects when it exists. While some assumptions can lead to unbiased estimates, these assumptions are generally unrealistic in the context of a network and often amount to assuming away the interference. In this work, we evaluate methods for designing and analyzing randomized experiments under minimal, realistic assumptions compatible with broad interference, where the aim is to reduce bias and possibly overall error in estimates of average effects of a global treatment. In design, we consider the ability to perform random assignment to treatments that is correlated in the network, such as through graph cluster randomization. In analysis, we consider incorporating information about the treatment assignment of network neighbors. We prove sufficient conditions for bias reduction through both design and analysis in the presence of potentially global interference; these conditions also give lower bounds on treatment effects. Through simulations of the entire process of experimentation in networks, we measure the performance of these methods under varied network structure and varied social behaviors, finding substantial bias reductions and, despite a bias–variance tradeoff, error reductions. These improvements are largest for networks with more clustering and data generating processes with both stronger direct effects of the treatment and stronger interactions between units. Keywords: causal inference; field experiments; peer effects; spillovers; social contagion; social network analysis; graph partitioning
dc.publisher: Walter de Gruyter GmbH
dc.relation.isversionof: http://dx.doi.org/10.1515/JCI-2015-0021
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/
dc.source: arXiv
dc.title: Design and Analysis of Experiments in Networks: Reducing Bias from Interference
dc.type: Article
dc.identifier.citation: Eckles, Dean et al. “Design and Analysis of Experiments in Networks: Reducing Bias from Interference.” Journal of Causal Inference 5, 1 (January 2016): 20150021 © 2017 Walter de Gruyter GmbH, Berlin/Boston
dc.contributor.department: Sloan School of Management
dc.contributor.mitauthor: Eckles, Dean Griffin
dc.relation.journal: Journal of Causal Inference
dc.eprint.version: Original manuscript
dc.type.uri: http://purl.org/eprint/type/JournalArticle
eprint.status: http://purl.org/eprint/status/NonPeerReviewed
dc.date.updated: 2019-02-07T19:43:35Z
dspace.orderedauthors: Eckles, Dean; Karrer, Brian; Ugander, Johan
dspace.embargo.terms: N
dc.identifier.orcid: https://orcid.org/0000-0001-8439-442X
mit.license: OPEN_ACCESS_POLICY
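
The abstract above names two levers: correlated assignment through graph cluster randomization (design) and use of neighbors' treatment assignments (analysis). Below is a minimal, illustrative Python sketch of those two ideas; it is not the paper's implementation, and the synthetic network, the modularity-based clustering, the exposure threshold, and the outcome model are all assumptions made only for this example.

# Minimal illustrative sketch (not the authors' code) of graph cluster
# randomization in design and neighborhood-exposure conditioning in analysis.
import random
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

random.seed(0)
G = nx.connected_watts_strogatz_graph(n=200, k=6, p=0.1, seed=0)  # stand-in network (assumption)

# Design: randomize at the level of graph clusters rather than individual
# nodes, so that most of a unit's neighbors share its assignment.
communities = list(greedy_modularity_communities(G))
cluster_of = {v: i for i, part in enumerate(communities) for v in part}
cluster_ids = list(range(len(communities)))
random.shuffle(cluster_ids)
treated_clusters = set(cluster_ids[: len(cluster_ids) // 2])
z = {v: int(cluster_of[v] in treated_clusters) for v in G}  # node-level assignment

# Synthetic outcomes (assumption, for illustration only): a direct effect of a
# unit's own treatment plus a spillover from the treated fraction of its neighborhood.
y = {v: 1.0 * z[v]
        + 0.5 * sum(z[u] for u in G[v]) / max(len(G[v]), 1)
        + random.gauss(0, 0.1)
     for v in G}

# Analysis: a simple exposure condition, keeping only units for which at least
# a given fraction of neighbors share the unit's own assignment.
def exposed(v, threshold=0.5):
    nbrs = list(G[v])
    share = sum(z[u] == z[v] for u in nbrs) / len(nbrs) if nbrs else 1.0
    return share >= threshold

def arm_mean(arm):
    units = [v for v in G if z[v] == arm and exposed(v)]
    return sum(y[v] for v in units) / len(units)

# Exposure-conditioned difference in means as an estimate of the average
# effect of the global treatment.
print("estimated average effect:", arm_mean(1) - arm_mean(0))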

