Box drawings for learning with imbalanced data
Author(s)
Goh, Siong Thye; Rudin, Cynthia
Download: Rudin_Box drawings.pdf (308.9 KB)
Open Access Policy
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
The vast majority of real-world classification problems are imbalanced, meaning there are far fewer data from the class of interest (the positive class) than from other classes. We propose two machine learning algorithms to handle highly imbalanced classification problems. The classifiers are disjunctions of conjunctions, constructed as unions of axis-parallel rectangles around the positive examples, and thus have the benefit of being interpretable. The first algorithm uses mixed integer programming to optimize a weighted balance between positive and negative class accuracies. Regularization is introduced to improve generalization performance. The second method uses an approximation in order to aid scalability. Specifically, it follows a "characterize then discriminate" approach, in which the positive class is first characterized by boxes, and each box boundary then becomes a separate discriminative classifier. This method has the computational advantages that it is easily parallelized and that it considers only the relevant regions of feature space.
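To make the classifier form described in the abstract concrete, the sketch below shows how a learned union of axis-parallel boxes could be evaluated on new points: a point is predicted positive if it lies inside any box. This is only a minimal illustration of the resulting decision rule, not the authors' learning algorithm (the paper fits the boxes via mixed integer programming or the characterize-then-discriminate approximation); the function name and array layout are assumptions made for this example.

```python
import numpy as np

def predict_union_of_boxes(X, lowers, uppers):
    """Predict +1 if a point lies inside any axis-parallel box, else -1.

    X      : (n_samples, n_features) data matrix
    lowers : (n_boxes, n_features) lower corners of the boxes
    uppers : (n_boxes, n_features) upper corners of the boxes
    (Names and shapes are illustrative assumptions, not the paper's notation.)
    """
    # Point i is inside box k if it satisfies the bounds on every feature.
    inside = np.all(
        (X[:, None, :] >= lowers[None, :, :]) & (X[:, None, :] <= uppers[None, :, :]),
        axis=2,
    )  # shape (n_samples, n_boxes)
    # Positive prediction if the point falls in at least one box.
    return np.where(inside.any(axis=1), 1, -1)

# Tiny usage example with two boxes in 2-D.
boxes_lo = np.array([[0.0, 0.0], [2.0, 2.0]])
boxes_hi = np.array([[1.0, 1.0], [3.0, 3.0]])
X = np.array([[0.5, 0.5], [2.5, 2.5], [1.5, 1.5]])
print(predict_union_of_boxes(X, boxes_lo, boxes_hi))  # [ 1  1 -1]
```

Because each box is a conjunction of per-feature interval tests, the overall rule is a disjunction of conjunctions, which is what makes the model easy to read off and to parallelize across boxes.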
Date issued
2014-08
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Operations Research Center; Sloan School of Management
Journal
Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining (KDD '14)
Publisher
Association for Computing Machinery (ACM)
Citation
Siong Thye Goh and Cynthia Rudin. 2014. Box drawings for learning with imbalanced data. In Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining (KDD '14). ACM, New York, NY, USA, 333-342.
Version: Author's final manuscript
ISBN
9781450329569