Show simple item record

dc.contributor.author	Daskalakis, C
dc.contributor.author	Tzamos, C
dc.contributor.author	Zampetakis, M
dc.date.accessioned	2022-06-17T14:42:26Z
dc.date.available	2022-06-17T14:42:26Z
dc.date.issued	2018-01-01
dc.identifier.uri	https://hdl.handle.net/1721.1/143462
dc.description.abstract	Copyright 2018 by the author(s). We study the convergence properties of the Expectation-Maximization algorithm in the Naive Bayes model. We show that EM can get stuck in regions of slow convergence, even when the features are binary and i.i.d. conditioned on the class label, and even under random (i.e., non-worst-case) initialization. In turn, we show that EM can be bootstrapped in a pre-training step that computes a good initialization. From this initialization we show, theoretically and experimentally, that EM converges exponentially fast to the true model parameters. Our bootstrapping method amounts to running the EM algorithm on appropriately centered iterates of small magnitude, which, as we show, corresponds to effectively performing power iteration on the covariance matrix of the mixture model, although the power iteration is performed under the hood by EM itself. As such, we call our bootstrapping approach "power EM." Specifically for the case of two binary features, we show global exponentially fast convergence of EM, even without bootstrapping. Finally, as the Naive Bayes model is quite expressive, we show as corollaries of our convergence results that the EM algorithm globally converges to the true model parameters for mixtures of two Gaussians, recovering recent results of [XHM16, DTZ17].	en_US
dc.language.iso	en
dc.relation.isversionof	https://proceedings.mlr.press/v84/daskalakis18a.html	en_US
dc.rights	Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.	en_US
dc.source	Proceedings of Machine Learning Research	en_US
dc.title	Bootstrapping EM via power EM and convergence in the Naive Bayes model	en_US
dc.type	Article	en_US
dc.identifier.citation	Daskalakis, C, Tzamos, C and Zampetakis, M. 2018. "Bootstrapping EM via power EM and convergence in the Naive Bayes model." International Conference on Artificial Intelligence and Statistics, AISTATS 2018.
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.contributor.department	Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
dc.relation.journal	International Conference on Artificial Intelligence and Statistics, AISTATS 2018	en_US
dc.eprint.version	Final published version	en_US
dc.type.uri	http://purl.org/eprint/type/ConferencePaper	en_US
eprint.status	http://purl.org/eprint/status/NonPeerReviewed	en_US
dc.date.updated	2022-06-17T14:31:05Z
dspace.orderedauthors	Daskalakis, C; Tzamos, C; Zampetakis, M	en_US
dspace.date.submission	2022-06-17T14:31:06Z
mit.license	PUBLISHER_POLICY
mit.metadata.status	Authority Work and Publication Information Needed	en_US
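The abstract states that the "power EM" pre-training step effectively performs power iteration on the covariance matrix of the mixture model. The following sketch (not the authors' code; all names, the toy mixture, and the parameter `mu` are illustrative) shows plain power iteration recovering the dominant direction of a sample covariance matrix from a binary-feature mixture of the kind the abstract describes:

```python
# Illustrative sketch only: plain power iteration on a sample covariance
# matrix, the operation the abstract says EM performs "under the hood".
import numpy as np

def power_iteration(cov, num_iters=100, seed=0):
    """Approximate the top eigenvector of a symmetric matrix."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(cov.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(num_iters):
        v = cov @ v
        v /= np.linalg.norm(v)  # re-normalize each step
    return v

# Toy two-class mixture over {-1, +1}^d with i.i.d. features given the
# label: P(x_j = 1 | label) = (1 + label * mu_j) / 2. 'mu' is made up.
rng = np.random.default_rng(1)
d, n = 5, 20000
mu = np.array([0.8, 0.6, 0.0, 0.0, 0.0]) / np.sqrt(2)
labels = rng.integers(0, 2, n) * 2 - 1
X = np.where(rng.random((n, d)) < (1 + labels[:, None] * mu) / 2, 1.0, -1.0)

cov = np.cov(X, rowvar=False)
v = power_iteration(cov)
# Up to sign, v lies close to the direction of the class mean mu,
# which is what makes it a useful initialization for EM.
```

This is only meant to illustrate the linear-algebraic idea; in the paper the iteration is realized implicitly by running EM on centered iterates of small magnitude, not by forming the covariance matrix explicitly.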

