Distributionally robust submodular maximization
Author(s)
Staib, Matthew; Wilder, Bryan; Jegelka, Stefanie Sabrina
Terms of use
Open Access Policy: Creative Commons Attribution-Noncommercial-Share Alike
Abstract
Submodular functions have applications throughout machine learning, but in many settings we do not have direct access to the underlying function f. We focus on stochastic functions that are given as an expectation of functions over a distribution P. In practice, we often have only a limited set of sample functions f_i drawn from P. The standard approach indirectly optimizes f by maximizing the sum of the f_i. However, this ignores generalization to the true (unknown) distribution. In this paper, we achieve better performance on the actual underlying function f by directly optimizing a combination of bias and variance. Algorithmically, we accomplish this by showing how to carry out distributionally robust optimization (DRO) for submodular functions, providing efficient algorithms backed by theoretical guarantees that leverage several novel contributions to the general theory of DRO. We also show compelling empirical evidence that DRO improves generalization to the unknown stochastic submodular function.
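As a rough illustration of the contrast the abstract draws, the Python sketch below compares greedy maximization of the empirical average (the standard approach) against a variance-regularized objective, one common proxy for DRO over a chi-squared ball around the empirical distribution. This is not the authors' algorithm: the facility-location sample functions, the radius parameter rho, and the specific regularizer are assumptions made for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)
    n_items, n_samples, k = 20, 30, 5

    # Each sampled function f_i is facility-location style:
    # f_i(S) = max_{j in S} W[i, j], which is monotone submodular.
    W = rng.random((n_samples, n_items))

    def sample_values(S):
        # Value of the set S under each sampled function f_i.
        return W[:, list(S)].max(axis=1)

    def saa_objective(S):
        # Standard approach: maximize the empirical mean of the f_i.
        return sample_values(S).mean()

    def dro_objective(S, rho=0.5):
        # Variance-regularized proxy for the worst-case expectation over a
        # chi-squared ball of radius rho around the empirical distribution.
        # The equivalence is approximate, and rho is a made-up knob here.
        v = sample_values(S)
        return v.mean() - np.sqrt(2.0 * rho * v.var(ddof=1) / len(v))

    def greedy(objective, n_items, k):
        # Plain greedy selection. For monotone submodular objectives this
        # enjoys the classic (1 - 1/e) guarantee; the variance-regularized
        # objective is not submodular in general, so there greedy is
        # used only heuristically.
        S = set()
        for _ in range(k):
            gains = {j: objective(S | {j}) for j in range(n_items) if j not in S}
            S.add(max(gains, key=gains.get))
        return S

    print("SAA greedy:", sorted(greedy(saa_objective, n_items, k)))
    print("DRO greedy:", sorted(greedy(dro_objective, n_items, k)))

The point of the comparison is the one the abstract makes: the sample-average objective ignores how the chosen set generalizes beyond the observed f_i, while the robust proxy explicitly trades mean value against variance across samples.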
Date issued
2019-04
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Journal
22nd International Conference on Artificial Intelligence and Statistics
Publisher
ML Research Press
Citation
Staib, Matthew, Bryan Wilder, and Stefanie Jegelka. "Distributionally robust submodular maximization." 22nd International Conference on Artificial Intelligence and Statistics, April 2019, Naha, Okinawa, Japan. ML Research Press. © 2019 by the author(s).
Version: Original manuscript