Statistical Learning: Stability is Sufficient for Generalization and Necessary and Sufficient for Consistency of Empirical Risk Minimization

Author(s)
Mukherjee, Sayan; Niyogi, Partha; Poggio, Tomaso; Rifkin, Ryan
Download
AIM-2002-024.ps (1.768Mb)
Additional downloads
AIM-2002-024.pdf (391.1Kb)
Abstract
Solutions of learning problems by Empirical Risk Minimization (ERM) need to be consistent, so that they may be predictive. They also need to be well-posed, so that they can be used robustly. We show that a statistical form of well-posedness, defined in terms of the key property of L-stability, is necessary and sufficient for consistency of ERM.
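To make the abstract's terms concrete, here is a rough sketch of the two central notions. The notation below (loss V, hypothesis space \mathcal{H}, training set S, rate \beta_n) is assumed for illustration only and is not quoted from the memo; see the downloadable PS/PDF for the precise definition of L-stability and the exact statements of the results.

% Illustrative sketch (requires amsmath, amssymb); notation assumed, not taken from the memo.
% Empirical Risk Minimization over a hypothesis space H, given S = {(x_1,y_1),...,(x_n,y_n)}:
\[
  f_S \;=\; \arg\min_{f \in \mathcal{H}} \; \frac{1}{n} \sum_{i=1}^{n} V\bigl(f(x_i), y_i\bigr).
\]
% ERM is consistent if the expected risk of f_S approaches the best risk achievable in H
% as the sample size n grows:
\[
  I[f_S] \;=\; \mathbb{E}_{(x,y)}\bigl[V(f_S(x), y)\bigr]
  \;\xrightarrow[n \to \infty]{P}\; \inf_{f \in \mathcal{H}} I[f].
\]
% A leave-one-out stability condition requires, roughly, that removing a single point
% from S perturb the loss of the learned function at that point only slightly,
% where S^{\setminus i} denotes S with the i-th example removed:
\[
  \bigl| V\bigl(f_S(x_i), y_i\bigr) - V\bigl(f_{S^{\setminus i}}(x_i), y_i\bigr) \bigr|
  \;\le\; \beta_n, \qquad \beta_n \to 0 \text{ (in probability)}.
\]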
Description
Revised July 2003
Date issued
2002-12-01
URI
http://hdl.handle.net/1721.3/5507
Other identifiers
AIM-2002-024
CBCL-223
Series/Report no.
AIM-2002-024; CBCL-223
Keywords
AI, Theory of Learning, Great Discoveries, Consistency, ERM, Stability

Collections
  • AI Memos (1959 - 2004)
  • CBCL Memos (1993 - 2004)
