DSpace@MIT (MIT Libraries)

  • DSpace@MIT Home
  • Center for Brains, Minds & Machines
  • Search

Now showing items 1-6 of 6


Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality? 

Poggio, Tomaso; Mhaskar, Hrushikesh; Rosasco, Lorenzo; Miranda, Brando; Liao, Qianli (Center for Brains, Minds and Machines (CBMM), arXiv, 2016-11-23)
[formerly titled "Why and When Can Deep – but Not Shallow – Networks Avoid the Curse of Dimensionality: a Review"] The paper reviews and extends an emerging body of theoretical results on deep learning including the ...

Musings on Deep Learning: Properties of SGD 

Zhang, Chiyuan; Liao, Qianli; Rakhlin, Alexander; Sridharan, Karthik; Miranda, Brando; et al. (Center for Brains, Minds and Machines (CBMM), 2017-04-04)
[previously titled "Theory of Deep Learning III: Generalization Properties of SGD"] In Theory III we characterize with a mix of theory and experiments the generalization properties of Stochastic Gradient Descent in ...

Theory IIIb: Generalization in Deep Networks 

Poggio, Tomaso; Liao, Qianli; Miranda, Brando; Burbanski, Andrzej; Hidary, Jack (Center for Brains, Minds and Machines (CBMM), arXiv.org, 2018-06-29)
The general features of the optimization problem for the case of overparametrized nonlinear networks have been clear for a while: SGD selects with high probability global minima vs local minima. In the overparametrized ...

Theory of Deep Learning IIb: Optimization Properties of SGD 

Zhang, Chiyuan; Liao, Qianli; Rakhlin, Alexander; Miranda, Brando; Golowich, Noah; et al. (Center for Brains, Minds and Machines (CBMM), 2017-12-27)
In Theory IIb we characterize with a mix of theory and experiments the optimization of deep convolutional networks by Stochastic Gradient Descent. The main new result in this paper is theoretical and experimental evidence ...

Classical generalization bounds are surprisingly tight for Deep Networks 

Liao, Qianli; Miranda, Brando; Hidary, Jack; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), 2018-07-11)
Deep networks are usually trained and tested in a regime in which the training classification error is not a good predictor of the test error. Thus the consensus has been that generalization, defined as convergence of the ...

Theory of Deep Learning III: explaining the non-overfitting puzzle 

Poggio, Tomaso; Kawaguchi, Kenji; Liao, Qianli; Miranda, Brando; Rosasco, Lorenzo; et al. (arXiv, 2017-12-30)
THIS MEMO IS REPLACED BY CBMM MEMO 90. A main puzzle of deep networks revolves around the absence of overfitting despite overparametrization and despite the large capacity demonstrated by zero training error on randomly ...

Content created by the MIT Libraries, CC BY-NC unless otherwise noted.