Show simple item record

dc.contributor.author: Yun, Chulhee
dc.contributor.author: Sra, Suvrit
dc.contributor.author: Jadbabaie, Ali
dc.date.accessioned: 2021-11-05T13:44:44Z
dc.date.available: 2021-11-05T13:44:44Z
dc.date.issued: 2019
dc.identifier.uri: https://hdl.handle.net/1721.1/137454
dc.description.abstract: © 7th International Conference on Learning Representations, ICLR 2019. All Rights Reserved. We investigate the loss surface of neural networks. We prove that even for one-hidden-layer networks with “slightest” nonlinearity, the empirical risks have spurious local minima in most cases. Our results thus indicate that in general “no spurious local minima” is a property limited to deep linear networks, and insights obtained from linear networks may not be robust. Specifically, for ReLU(-like) networks we constructively prove that for almost all practical datasets there exist infinitely many local minima. We also present a counterexample for more general activations (sigmoid, tanh, arctan, ReLU, etc.), for which there exists a bad local minimum. Our results make the least restrictive assumptions relative to existing results on spurious local optima in neural networks. We complete our discussion by presenting a comprehensive characterization of global optimality for deep linear networks, which unifies other results on this topic. en_US
dc.language.iso: en
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike en_US
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/ en_US
dc.source: arXiv en_US
dc.title: Small nonlinearities in activation functions create bad local minima in neural networks en_US
dc.type: Article en_US
dc.identifier.citation: Yun, Chulhee, Sra, Suvrit and Jadbabaie, Ali. 2019. "Small nonlinearities in activation functions create bad local minima in neural networks." 7th International Conference on Learning Representations, ICLR 2019.
dc.contributor.department: Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.contributor.department: Massachusetts Institute of Technology. Department of Civil and Environmental Engineering
dc.contributor.department: Massachusetts Institute of Technology. Institute for Data, Systems, and Society
dc.relation.journal: 7th International Conference on Learning Representations, ICLR 2019 en_US
dc.eprint.version: Original manuscript en_US
dc.type.uri: http://purl.org/eprint/type/ConferencePaper en_US
eprint.status: http://purl.org/eprint/status/NonPeerReviewed en_US
dc.date.updated: 2021-04-12T17:31:46Z
dspace.orderedauthors: Yun, C; Sra, S; Jadbabaie, A en_US
dspace.date.submission: 2021-04-12T17:31:47Z
mit.license: OPEN_ACCESS_POLICY
mit.metadata.status: Authority Work and Publication Information Needed en_US
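
A minimal sketch, in LaTeX, of the setting the abstract above refers to: a one-hidden-layer network with ReLU activation and its empirical risk. The squared loss and the specific parameterization below are illustrative assumptions; the paper also treats other activations and "ReLU-like" nonlinearities.

% One-hidden-layer network f_\theta and empirical risk \hat{R}_n over a dataset \{(x_i, y_i)\}_{i=1}^n.
% Squared loss is an assumption made for this sketch only.
\[
  f_\theta(x) = W_2\,\sigma(W_1 x + b_1) + b_2,
  \qquad
  \sigma(t) = \max\{t, 0\} \ \text{(ReLU, applied entrywise)},
\]
\[
  \hat{R}_n(\theta) = \frac{1}{n} \sum_{i=1}^{n} \bigl\lVert f_\theta(x_i) - y_i \bigr\rVert^2,
  \qquad
  \theta = (W_1, b_1, W_2, b_2).
\]

In this notation, the abstract's claim is that for almost all datasets there exist parameters \theta^\star that are local minima of \hat{R}_n yet have risk strictly larger than the global minimum, i.e. spurious local minima.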

