dc.contributor.author | Daskalakis, C | |
dc.contributor.author | Tzamos, C | |
dc.contributor.author | Zampetakis, M | |
dc.date.accessioned | 2022-06-17T14:42:26Z | |
dc.date.available | 2022-06-17T14:42:26Z | |
dc.date.issued | 2018-01-01 | |
dc.identifier.uri | https://hdl.handle.net/1721.1/143462 | |
dc.description.abstract | We study the convergence properties of the Expectation-Maximization (EM) algorithm in the Naive Bayes model. We show that EM can get stuck in regions of slow convergence, even when the features are binary and i.i.d. conditioned on the class label, and even under random (i.e., non-worst-case) initialization. In turn, we show that EM can be bootstrapped via a pre-training step that computes a good initialization. From this initialization, we show theoretically and experimentally that EM converges exponentially fast to the true model parameters. Our bootstrapping method amounts to running the EM algorithm on appropriately centered iterates of small magnitude, which, as we show, corresponds to effectively performing power iteration on the covariance matrix of the mixture model, although the power iteration is performed under the hood by EM itself. As such, we call our bootstrapping approach “power EM.” Specifically, for the case of two binary features, we show global exponentially fast convergence of EM, even without bootstrapping. Finally, as the Naive Bayes model is quite expressive, we show as corollaries of our convergence results that the EM algorithm globally converges to the true model parameters for mixtures of two Gaussians, recovering recent results of [XHM16, DTZ17]. | en_US |
dc.language.iso | en | |
dc.relation.isversionof | https://proceedings.mlr.press/v84/daskalakis18a.html | en_US |
dc.rights | Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. | en_US |
dc.source | Proceedings of Machine Learning Research | en_US |
dc.title | Bootstrapping EM via power EM and convergence in the Naive Bayes model | en_US |
dc.type | Article | en_US |
dc.identifier.citation | Daskalakis, C., Tzamos, C. and Zampetakis, M. 2018. "Bootstrapping EM via power EM and convergence in the Naive Bayes model." International Conference on Artificial Intelligence and Statistics, AISTATS 2018. | |
dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | |
dc.contributor.department | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory | |
dc.relation.journal | International Conference on Artificial Intelligence and Statistics, AISTATS 2018 | en_US |
dc.eprint.version | Final published version | en_US |
dc.type.uri | http://purl.org/eprint/type/ConferencePaper | en_US |
eprint.status | http://purl.org/eprint/status/NonPeerReviewed | en_US |
dc.date.updated | 2022-06-17T14:31:05Z | |
dspace.orderedauthors | Daskalakis, C; Tzamos, C; Zampetakis, M | en_US |
dspace.date.submission | 2022-06-17T14:31:06Z | |
mit.license | PUBLISHER_POLICY | |
mit.metadata.status | Authority Work and Publication Information Needed | en_US |