Tight bounds for the expected risk of linear classifiers and PAC-Bayes finite-sample guarantees
Author(s): Honorio Carrillo, Jean; Jaakkola, Tommi S.
We analyze the expected risk of linear classifiers for a fixed weight vector in the “minimax” setting. That is, we analyze the worst-case risk among all data distributions with a given mean and covariance. We provide a simpler proof of the tight polynomial-tail bound for general random variables. For sub-Gaussian random variables, we derive a novel tight exponential bound. We also provide new PAC-Bayes finite-sample guarantees when training data is available. Our “minimax” generalization bounds are dimensionality-independent and O(√(1/m)) for m samples.
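The tight polynomial-tail bound mentioned above can be illustrated with the one-sided Chebyshev (Cantelli) inequality: for any random margin with mean μ > 0 and variance σ², the worst-case probability of a non-positive margin over all such distributions is σ²/(σ² + μ²), and this value is attained. The sketch below is illustrative only; the function name and the example numbers are our own, not from the paper.

```python
import numpy as np

def cantelli_minimax_risk(mu, sigma2):
    """Worst-case P(margin <= 0) over all distributions of the
    classification margin with mean mu > 0 and variance sigma2
    (the one-sided Chebyshev / Cantelli bound, which is tight)."""
    return sigma2 / (sigma2 + mu ** 2)

# Hypothetical numbers: a margin with mean 1.0 and variance 0.5
# gives a minimax risk of 0.5 / 1.5 = 1/3.
bound = cantelli_minimax_risk(1.0, 0.5)

# Monte Carlo sanity check: the risk of any specific distribution
# with that mean and variance (here, a Gaussian margin) cannot
# exceed the minimax bound.
rng = np.random.default_rng(0)
margins = rng.normal(loc=1.0, scale=np.sqrt(0.5), size=100_000)
empirical_risk = np.mean(margins <= 0)
assert empirical_risk <= bound
```

The bound is distribution-free given the first two moments, which is why the abstract's worst-case analysis needs only the mean and covariance of the data.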
Department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Proceedings of the 17th International Conference on Artificial Intelligence and Statistics (AISTATS)
Journal of Machine Learning Research
Honorio, Jean and Tommi Jaakkola. "Tight bounds for the expected risk of linear classifiers and PAC-Bayes finite-sample guarantees." Proceedings of the 17th International Conference on Artificial Intelligence and Statistics (AISTATS), 22-25 April 2014, Reykjavik, Iceland, Journal of Machine Learning Research, 2014.
Author's final manuscript