Show simple item record

dc.contributor.author: Yun, Chulhee
dc.contributor.author: Sra, Suvrit
dc.contributor.author: Jadbabaie, Ali
dc.date.accessioned: 2021-11-05T14:27:01Z
dc.date.available: 2021-11-05T14:27:01Z
dc.date.issued: 2019
dc.identifier.uri: https://hdl.handle.net/1721.1/137480
dc.description.abstract: © 2019 Neural information processing systems foundation. All rights reserved. We study finite sample expressivity, i.e., memorization power of ReLU networks. Recent results require N hidden nodes to memorize/interpolate arbitrary N data points. In contrast, by exploiting depth, we show that 3-layer ReLU networks with Ω(√N) hidden nodes can perfectly memorize most datasets with N points. We also prove that width Θ(√N) is necessary and sufficient for memorizing N data points, proving tight bounds on memorization capacity. The sufficiency result can be extended to deeper networks; we show that an L-layer network with W parameters in the hidden layers can memorize N data points if W = Ω(N). Combined with a recent upper bound O(WL log W) on VC dimension, our construction is nearly tight for any fixed L. Subsequently, we analyze memorization capacity of residual networks under a general position assumption; we prove results that substantially reduce the known requirement of N hidden nodes. Finally, we study the dynamics of stochastic gradient descent (SGD), and show that when initialized near a memorizing global minimum of the empirical risk, SGD quickly finds a nearby point with much smaller empirical risk. [en_US]
dc.language.iso: en
dc.relation.isversionof: https://papers.nips.cc/paper/2019/hash/dbea3d0e2a17c170c412c74273778159-Abstract.html [en_US]
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. [en_US]
dc.source: Neural Information Processing Systems (NIPS) [en_US]
dc.title: Small ReLU networks are powerful memorizers: A tight analysis of memorization capacity [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Yun, Chulhee, Sra, Suvrit and Jadbabaie, Ali. 2019. "Small ReLU networks are powerful memorizers: A tight analysis of memorization capacity." Advances in Neural Information Processing Systems, 32.
dc.contributor.department: Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
dc.contributor.department: Massachusetts Institute of Technology. Institute for Data, Systems, and Society
dc.relation.journal: Advances in Neural Information Processing Systems [en_US]
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
eprint.status: http://purl.org/eprint/status/NonPeerReviewed [en_US]
dc.date.updated: 2021-03-25T18:14:08Z
dspace.orderedauthors: Yun, C; Sra, S; Jadbabaie, A [en_US]
dspace.date.submission: 2021-03-25T18:14:10Z
mit.journal.volume: 32 [en_US]
mit.license: PUBLISHER_POLICY
mit.metadata.status: Authority Work and Publication Information Needed [en_US]
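The abstract contrasts the paper's Θ(√N)-width results with the classic baseline that N hidden ReLU nodes suffice to interpolate N one-dimensional data points. That baseline can be sketched with a textbook piecewise-linear construction (this is an illustrative sketch only, not the paper's 3-layer construction; `fit_relu_memorizer` is a name chosen here for illustration):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def fit_relu_memorizer(x, y):
    """One-hidden-layer ReLU net interpolating (x_i, y_i) with N-1 units.

    Classic construction: sort the inputs, place one ReLU kink at each of
    the first N-1 points, and pick output weights so each segment of the
    resulting piecewise-linear function has the interpolating slope.
    """
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    slopes = np.diff(ys) / np.diff(xs)   # slope of each linear segment
    a = np.diff(slopes, prepend=0.0)     # output weight a_j = s_j - s_{j-1}
    biases = xs[:-1]                     # kink at each of the first N-1 inputs
    c = ys[0]                            # constant offset = leftmost target

    def net(t):
        t = np.asarray(t, dtype=float)
        return c + relu(t[..., None] - biases) @ a

    return net

# Usage: 8 random points are memorized exactly.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 8)
y = rng.normal(size=8)
net = fit_relu_memorizer(x, y)
assert np.allclose(net(x), y)
```

The paper's point is that this linear-in-N width is far from optimal: exploiting depth brings the requirement down to Θ(√N).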

