Design and Analysis of Experiments in Networks: Reducing Bias from Interference
Author(s)
Karrer, Brian; Ugander, Johan; Eckles, Dean Griffin
Terms of use
Open Access Policy
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
Estimating the effects of interventions in networks is complicated due to interference, such that the outcomes for one experimental unit may depend on the treatment assignments of other units. Familiar statistical formalism, experimental designs, and analysis methods assume the absence of this interference, and result in biased estimates of causal effects when it exists. While some assumptions can lead to unbiased estimates, these assumptions are generally unrealistic in the context of a network and often amount to assuming away the interference. In this work, we evaluate methods for designing and analyzing randomized experiments under minimal, realistic assumptions compatible with broad interference, where the aim is to reduce bias and possibly overall error in estimates of average effects of a global treatment. In design, we consider the ability to perform random assignment to treatments that is correlated in the network, such as through graph cluster randomization. In analysis, we consider incorporating information about the treatment assignment of network neighbors. We prove sufficient conditions for bias reduction through both design and analysis in the presence of potentially global interference; these conditions also give lower bounds on treatment effects. Through simulations of the entire process of experimentation in networks, we measure the performance of these methods under varied network structure and varied social behaviors, finding substantial bias reductions and, despite a bias–variance tradeoff, error reductions. These improvements are largest for networks with more clustering and data generating processes with both stronger direct effects of the treatment and stronger interactions between units.
Keywords
causal inference; field experiments; peer effects; spillovers; social contagion; social network analysis; graph partitioning
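The abstract names two levers: a design that correlates treatment assignments over the network via graph cluster randomization, and an analysis that uses the treatment assignments of network neighbors. The sketch below is a minimal illustration of both ideas in Python, not the authors' code or estimator; the Watts–Strogatz graph, the modularity-based clustering, the Bernoulli(1/2) cluster-level assignment, the outcome model, and the 0.75 exposure threshold are all illustrative assumptions.

```python
# Minimal sketch of graph cluster randomization with a neighborhood-exposure
# analysis (illustrative only; not the paper's estimator).
import random
import networkx as nx
from networkx.algorithms import community

random.seed(0)

# Simulated clustered network (stand-in for a real social graph).
G = nx.watts_strogatz_graph(n=500, k=10, p=0.1, seed=0)

# Design: partition the graph into clusters and randomize at the cluster
# level, so assignments are correlated within network neighborhoods.
clusters = community.greedy_modularity_communities(G)
treated = set()
for cluster in clusters:
    if random.random() < 0.5:  # Bernoulli(1/2) assignment per cluster
        treated.update(cluster)

def exposure(node):
    """Fraction of a node's neighbors that are treated."""
    nbrs = list(G.neighbors(node))
    return sum(n in treated for n in nbrs) / max(len(nbrs), 1)

def outcome(node):
    """Hypothetical outcome: direct effect plus interference from
    treated neighbors, plus noise (illustrative data generating process)."""
    return 1.0 * (node in treated) + 0.5 * exposure(node) + random.gauss(0, 0.1)

# Analysis: restrict to units whose neighborhoods are mostly in the same
# condition, approximating exposure to global treatment or global control.
threshold = 0.75  # hypothetical exposure threshold
treated_exposed = [outcome(v) for v in G if v in treated and exposure(v) >= threshold]
control_exposed = [outcome(v) for v in G if v not in treated and exposure(v) <= 1 - threshold]
assert treated_exposed and control_exposed, "No units met the exposure threshold"

ate_hat = (sum(treated_exposed) / len(treated_exposed)
           - sum(control_exposed) / len(control_exposed))
print(f"Estimated average effect of global treatment: {ate_hat:.3f}")
```

Restricting the comparison to highly exposed units discards data, which is one face of the bias–variance tradeoff the abstract refers to: bias from interference goes down while the variance of the estimate goes up.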
Date issued
2019-03-05
Department
Sloan School of Management
Journal
Journal of Causal Inference
Publisher
Walter de Gruyter GmbH
Citation
Eckles, Dean et al. “Design and Analysis of Experiments in Networks: Reducing Bias from Interference.” Journal of Causal Inference 5, 1 (January 2016): 20150021 © 2017 Walter de Gruyter GmbH, Berlin/Boston
Version: Original manuscript
ISSN
2193-3685