DSpace@MIT
  • DSpace@MIT Home
  • MIT Open Access Articles
  • MIT Open Access Articles
  • View Item

Every Local Minimum Value Is the Global Minimum Value of Induced Model in Nonconvex Machine Learning

Author(s)
Kawaguchi, Kenji; Huang, Jiaoyang; Kaelbling, Leslie P
Download
Published version (556.9Kb)
Terms of use
Creative Commons Attribution 4.0 International license https://creativecommons.org/licenses/by/4.0/
Abstract
For nonconvex optimization in machine learning, this article proves that every local minimum achieves the globally optimal value of the perturbable gradient basis model at any differentiable point. As a result, nonconvex machine learning is theoretically as supported as convex machine learning with a handcrafted basis in terms of the loss at differentiable local minima, except in the case when a preference is given to the handcrafted basis over the perturbable gradient basis. The proofs of these results are derived under mild assumptions. Accordingly, the proven results are directly applicable to many machine learning models, including practical deep neural networks, without any modification of practical methods. Furthermore, as special cases of our general results, this article improves or complements several state-of-the-art theoretical results on deep neural networks, deep residual networks, and overparameterized deep neural networks with a unified proof technique and novel geometric insights. A special case of our results also contributes to the theoretical foundation of representation learning.
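The abstract's central claim can be sketched informally as follows. The notation here (loss $\mathcal{L}$, parameters $\theta$, induced-model parameters $w$) is illustrative only, not the paper's exact formalism:

```latex
% Informal schematic of the main result (notation is illustrative).
% Let L(theta) denote a possibly nonconvex training loss.
\[
  \theta^{*} \text{ a local minimum of } \mathcal{L},
  \text{ with } \mathcal{L} \text{ differentiable at } \theta^{*}
  \;\Longrightarrow\;
  \mathcal{L}(\theta^{*}) \;=\; \min_{w}\, \widetilde{\mathcal{L}}_{\theta^{*}}(w),
\]
% where \widetilde{L}_{theta*} is the loss of the induced model that is
% linear in w over the perturbable gradient basis evaluated at theta*.
```

That is, the local minimum value of the original nonconvex problem coincides with the global minimum value of an associated convex-in-$w$ problem over the perturbable gradient basis, which is the sense in which the abstract compares nonconvex learning to convex learning with a handcrafted basis.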
Date issued
2019-12
URI
https://hdl.handle.net/1721.1/129666
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Journal
Neural Computation
Publisher
MIT Press - Journals
Citation
Kawaguchi, Kenji et al. "Every Local Minimum Value Is the Global Minimum Value of Induced Model in Nonconvex Machine Learning." Neural Computation 31, 12 (December 2019): 2293-2323 © 2019 Massachusetts Institute of Technology
Version: Final published version
ISSN
0899-7667
1530-888X

Collections
  • MIT Open Access Articles
