Notice
This is not the latest version of this item. The latest version can be found at: https://dspace.mit.edu/handle/1721.1/137183.2
Data-dependent coresets for compressing neural networks with applications to generalization bounds
| Metadata field | Value | Language |
| --- | --- | --- |
| dc.date.accessioned | 2021-11-03T14:15:59Z | |
| dc.date.available | 2021-11-03T14:15:59Z | |
| dc.date.issued | 2019-05 | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/137183 | |
| dc.description.abstract | We present an efficient coresets-based neural network compression algorithm that sparsifies the parameters of a trained fully-connected neural network in a manner that provably approximates the network's output. Our approach is based on an importance sampling scheme that judiciously defines a sampling distribution over the neural network parameters, and as a result, retains parameters of high importance while discarding redundant ones. We leverage a novel, empirical notion of sensitivity and extend traditional coreset constructions to the application of compressing parameters. Our theoretical analysis establishes guarantees on the size and accuracy of the resulting compressed network and gives rise to generalization bounds that may provide new insights into the generalization properties of neural networks. We demonstrate the practical effectiveness of our algorithm on a variety of neural network configurations and real-world data sets. | en_US |
| dc.language.iso | en | |
| dc.rights | Creative Commons Attribution-Noncommercial-Share Alike | en_US |
| dc.rights.uri | http://creativecommons.org/licenses/by-nc-sa/4.0/ | en_US |
| dc.source | MIT web domain | en_US |
| dc.title | Data-dependent coresets for compressing neural networks with applications to generalization bounds | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | 2019. "Data-dependent coresets for compressing neural networks with applications to generalization bounds." 7th International Conference on Learning Representations, ICLR 2019. | |
| dc.relation.journal | 7th International Conference on Learning Representations, ICLR 2019 | en_US |
| dc.eprint.version | Author's final manuscript | en_US |
| dc.type.uri | http://purl.org/eprint/type/ConferencePaper | en_US |
| eprint.status | http://purl.org/eprint/status/NonPeerReviewed | en_US |
| dc.date.updated | 2021-04-15T16:53:09Z | |
| dspace.orderedauthors | Baykal, C; Liebenwein, L; Gilitschenski, I; Feldman, D; Rus, D | en_US |
| dspace.date.submission | 2021-04-15T16:53:10Z | |
| mit.license | OPEN_ACCESS_POLICY | |
| mit.metadata.status | Authority Work and Publication Information Needed | en_US |
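The abstract describes a compression scheme built on empirical sensitivity: each weight is assigned a sampling probability reflecting its largest relative contribution to the neuron's pre-activation over a batch of data, important weights are sampled, and sampled weights are reweighted so the sparsified neuron's output is an unbiased estimate of the original. The sketch below illustrates that general idea for a single neuron; it is a minimal illustration, not the paper's CoreNet algorithm, and the function name `sparsify_neuron` and the specific sensitivity formula are assumptions made for this example.

```python
import numpy as np

def sparsify_neuron(w, X, m, rng=None):
    """Sparsify one neuron's weight vector w via sensitivity-based
    importance sampling over a batch of inputs X (n_samples x d).

    Illustrative sketch only: the empirical sensitivity of weight j is
    taken here as the maximum, over the batch, of its relative
    contribution to the neuron's pre-activation.
    """
    rng = np.random.default_rng() if rng is None else rng
    contrib = np.abs(w[None, :] * X)                 # |w_j * x_j| per sample
    total = contrib.sum(axis=1, keepdims=True) + 1e-12
    sensitivity = (contrib / total).max(axis=0)      # empirical sensitivity s_j
    probs = sensitivity / sensitivity.sum()          # sampling distribution
    idx = rng.choice(len(w), size=m, replace=True, p=probs)
    w_hat = np.zeros_like(w, dtype=float)
    # Reweight each sampled entry by 1 / (m * p_j) so the estimator is
    # unbiased: E[w_hat_j] = m * p_j * w_j / (m * p_j) = w_j.
    np.add.at(w_hat, idx, w[idx] / (m * probs[idx]))
    return w_hat
```

With `m` much smaller than the dimension, `w_hat` has at most `m` nonzero entries, yet the dot product `x @ w_hat` matches `x @ w` in expectation; the sampling distribution concentrates the sample budget on weights whose removal would perturb the output most, which is the intuition behind the paper's size and accuracy guarantees.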
