
dc.contributor.author: Mhaskar, HN
dc.contributor.author: Poggio, Tomaso A
dc.date.accessioned: 2021-12-02T19:58:25Z
dc.date.available: 2021-12-02T19:58:25Z
dc.date.issued: 2020
dc.identifier.uri: https://hdl.handle.net/1721.1/138295
dc.description.abstract: © 2019 Elsevier Ltd. This paper is motivated by an open problem concerning deep networks, namely, the apparent absence of over-fitting despite large over-parametrization, which allows perfect fitting of the training data. We analyze this phenomenon in the case of regression problems in which each unit evaluates a periodic activation function. We argue that the minimal expected value of the square loss is inappropriate for measuring the generalization error in the approximation of compositional functions, since it fails to take full advantage of the compositional structure. Instead, we measure the generalization error in the sense of maximum loss, and sometimes as a pointwise error. We give estimates of exactly how many parameters ensure both zero training error and a good generalization error. We prove that a solution of a regularization problem is guaranteed to yield both a good training error and a good generalization error, and we estimate how much error to expect at which test data.
dc.language.iso: en
dc.publisher: Elsevier BV
dc.relation.isversionof: 10.1016/J.NEUNET.2019.08.028
dc.rights: Creative Commons Attribution-NonCommercial-NoDerivs License
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.source: arXiv
dc.title: An analysis of training and generalization errors in shallow and deep networks
dc.type: Article
dc.identifier.citation: Mhaskar, HN and Poggio, T. 2020. "An analysis of training and generalization errors in shallow and deep networks." Neural Networks, 121.
dc.contributor.department: Center for Brains, Minds, and Machines
dc.contributor.department: McGovern Institute for Brain Research at MIT
dc.relation.journal: Neural Networks
dc.eprint.version: Author's final manuscript
dc.type.uri: http://purl.org/eprint/type/JournalArticle
eprint.status: http://purl.org/eprint/status/PeerReviewed
dc.date.updated: 2021-12-02T19:51:52Z
dspace.orderedauthors: Mhaskar, HN; Poggio, T
dspace.date.submission: 2021-12-02T19:51:53Z
mit.journal.volume: 121
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed

