Two-sided exponential concentration bounds for Bayes error rate and Shannon entropy
Author(s)
Honorio, Jean; Jaakkola, Tommi S.
Terms of use
Publisher Policy: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
We provide a method that approximates the Bayes error rate and the Shannon entropy with high probability. The Bayes error rate approximation makes it possible to build a classifier that polynomially approaches the Bayes error rate. The Shannon entropy approximation provides provable performance guarantees for learning trees and Bayesian networks from continuous variables. Our results rely on reasonable regularity conditions on the unknown probability distributions, and apply to bounded as well as unbounded variables.
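For orientation, the quantities named in the abstract are, in standard notation, the Bayes error rate and the (differential) Shannon entropy, and the guarantee referred to has the generic two-sided exponential form sketched below. This is a schematic illustration only: the estimator \hat{H}_n, the constant c, and the exact dependence on the regularity conditions are placeholders, not the paper's precise statement.

R^* = \mathbb{E}_{x}\left[\, 1 - \max_{y} \Pr(Y = y \mid X = x) \,\right]    (Bayes error rate)
H(p) = -\int p(x)\,\log p(x)\, dx    (differential Shannon entropy)
\Pr\left(\, \left| \hat{H}_n - H(p) \right| \ge \varepsilon \,\right) \le 2 \exp\!\left(-c\, n\, \varepsilon^{2}\right)    (two-sided exponential concentration, schematic)

Here \hat{H}_n denotes an estimate computed from n samples, and an analogous two-sided bound is sought for an estimate of R^*.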
Date issued
2013
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Journal of Machine Learning Research
Publisher
Association for Computing Machinery (ACM)
Citation
Honorio, Jean, and Tommi Jaakkola. "Two-sided exponential concentration bounds for Bayes error rate and Shannon entropy." Journal of Machine Learning Research W&CP 28(3): 459–467 (2013).
Version: Final published version
ISSN
1532-4435
1533-7928