Learning a tree-structured Ising model in order to make predictions
Author(s)
Bresler, Guy; Karzand, Mina
Open Access Policy
Creative Commons Attribution-Noncommercial-Share Alike
Terms of use
Metadata
Abstract
We study the problem of learning a tree Ising model from samples such that subsequent predictions made using the model are accurate. The prediction task considered in this paper is that of predicting the values of a subset of variables given values of some other subset of variables. Virtually all previous work on graphical model learning has focused on recovering the true underlying graph. We define a distance (“small set TV” or ssTV) between distributions P and Q by taking the maximum, over all subsets S of a given size, of the total variation between the marginals of P and Q on S; this distance captures the accuracy of the prediction task of interest. We derive nonasymptotic bounds on the number of samples needed to get a distribution (from the same class) with small ssTV relative to the one generating the samples. One of the main messages of this paper is that far fewer samples are needed than for recovering the underlying tree, which means that accurate predictions are possible using the wrong tree.
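The ssTV distance described above can be made concrete with a small sketch (not taken from the paper's code): given two joint distributions over n binary variables, it takes the maximum, over all size-k subsets S, of the total variation distance between the marginals on S. Joints are represented here as dictionaries mapping binary tuples to probabilities; the function names are illustrative, not from the paper.

```python
# Minimal sketch of the "small set TV" (ssTV) distance from the abstract:
# max over all subsets S of size k of TV(P_S, Q_S).
from itertools import combinations, product

def marginal(joint, subset):
    """Marginal of `joint` on the variables indexed by `subset`."""
    m = {}
    for x, p in joint.items():
        key = tuple(x[i] for i in subset)
        m[key] = m.get(key, 0.0) + p
    return m

def ss_tv(p, q, n, k):
    """ssTV: max over |S| = k of (1/2) * L1 distance between marginals."""
    best = 0.0
    for s in combinations(range(n), k):
        mp, mq = marginal(p, s), marginal(q, s)
        tv = 0.5 * sum(abs(mp.get(x, 0.0) - mq.get(x, 0.0))
                       for x in product((0, 1), repeat=k))
        best = max(best, tv)
    return best
```

For example, the uniform distribution on {0,1}^2 and the distribution placing mass 1/2 on each of (0,0) and (1,1) have identical single-variable marginals (ssTV = 0 for k = 1) but ssTV = 1/2 for k = 2, which illustrates why the subset size matters for prediction accuracy.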
Date issued
2020-04
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Journal
Annals of Statistics
Publisher
Institute of Mathematical Statistics
Citation
Bresler, Guy and Mina Karzand. "Learning a tree-structured Ising model in order to make predictions." Annals of Statistics, 48, 2 (April 2020): 713-737 © 2020 The Author(s)
Version: Author's final manuscript
ISSN
0090-5364