Show simple item record

dc.contributor.author: Baykal, Cenk
dc.contributor.author: Liebenwein, Lucas
dc.contributor.author: Gilitschenski, Igor
dc.contributor.author: Feldman, Dan
dc.contributor.author: Rus, Daniela L
dc.date.accessioned: 2022-09-16T16:09:26Z
dc.date.available: 2021-11-03T14:15:59Z
dc.date.available: 2022-09-16T16:09:26Z
dc.date.issued: 2019-05
dc.identifier.uri: https://hdl.handle.net/1721.1/137183.2
dc.description.abstract: We present an efficient coresets-based neural network compression algorithm that sparsifies the parameters of a trained fully-connected neural network in a manner that provably approximates the network's output. Our approach is based on an importance sampling scheme that judiciously defines a sampling distribution over the neural network parameters, and as a result, retains parameters of high importance while discarding redundant ones. We leverage a novel, empirical notion of sensitivity and extend traditional coreset constructions to the application of compressing parameters. Our theoretical analysis establishes guarantees on the size and accuracy of the resulting compressed network and gives rise to generalization bounds that may provide new insights into the generalization properties of neural networks. We demonstrate the practical effectiveness of our algorithm on a variety of neural network configurations and real-world data sets. (en_US)
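The abstract describes sensitivity-based importance sampling over the weights of a trained network. The following is a minimal sketch of that general idea for a single neuron, not the paper's exact coreset construction: the function name `sparsify_neuron`, the sensitivity estimate, and the toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparsify_neuron(w, X, m):
    """Sparsify one neuron's incoming weights w via importance sampling.

    Sketch of a coreset-style scheme: edges with high empirical
    sensitivity (largest relative contribution to the pre-activation
    over the data X) are sampled with high probability, and sampled
    weights are reweighted so the estimate stays unbiased.
    """
    # Empirical sensitivity: each edge's maximum relative share of the
    # absolute pre-activation across the data points in X.
    contrib = np.abs(X * w)                        # (n_points, n_edges)
    share = contrib / contrib.sum(axis=1, keepdims=True)
    s = share.max(axis=0)                          # per-edge sensitivity
    p = s / s.sum()                                # sampling distribution

    # Draw m edges with replacement; reweight each draw by 1/(m * p_j),
    # so E[w_hat_j] = (m * p_j) * w_j / (m * p_j) = w_j entry-wise.
    idx = rng.choice(len(w), size=m, p=p)
    w_hat = np.zeros_like(w)
    np.add.at(w_hat, idx, w[idx] / (m * p[idx]))
    return w_hat

# Toy example: a 256-input neuron compressed to at most 32 retained edges.
w = rng.normal(size=256)
X = rng.normal(size=(100, 256))
w_hat = sparsify_neuron(w, X, m=32)
```

Because sampling is with replacement, at most `m` entries of `w_hat` are nonzero, and redundant (low-sensitivity) edges are rarely drawn.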
dc.language.iso: en
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike (en_US)
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/ (en_US)
dc.source: MIT web domain (en_US)
dc.title: Data-dependent coresets for compressing neural networks with applications to generalization bounds (en_US)
dc.type: Article (en_US)
dc.identifier.citation: 2019. "Data-dependent coresets for compressing neural networks with applications to generalization bounds." 7th International Conference on Learning Representations, ICLR 2019. (en_US)
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory (en_US)
dc.relation.journal: 7th International Conference on Learning Representations, ICLR 2019 (en_US)
dc.eprint.version: Author's final manuscript (en_US)
dc.type.uri: http://purl.org/eprint/type/ConferencePaper (en_US)
eprint.status: http://purl.org/eprint/status/NonPeerReviewed (en_US)
dc.date.updated: 2021-04-15T16:53:09Z
dspace.orderedauthors: Baykal, C; Liebenwein, L; Gilitschenski, I; Feldman, D; Rus, D (en_US)
dspace.date.submission: 2021-04-15T16:53:10Z
mit.license: OPEN_ACCESS_POLICY
mit.metadata.status: Publication Information Needed (en_US)

