Notice

This is not the latest version of this item. The latest version can be found at: https://dspace.mit.edu/handle/1721.1/137766.2

Show simple item record

dc.contributor.author: Painsky, Amichai
dc.contributor.author: Wornell, Gregory
dc.date.accessioned: 2021-11-08T18:32:56Z
dc.date.available: 2021-11-08T18:32:56Z
dc.date.issued: 2018-06
dc.identifier.uri: https://hdl.handle.net/1721.1/137766
dc.description.abstract: © 2018 IEEE. A loss function measures the discrepancy between the true values (observations) and their estimated fits, for a given instance of data. A loss function is said to be proper (unbiased, Fisher consistent) if the fits are defined over a unit simplex and the minimizer of the expected loss is the true underlying probability of the data. Typical examples are the zero-one loss, the quadratic loss, and the Bernoulli log-likelihood loss (log-loss). In this work we show that for binary classification problems, the divergence associated with smooth, proper, and convex loss functions is bounded from above by the Kullback-Leibler (KL) divergence, up to a multiplicative normalization constant. This implies that by minimizing the log-loss (associated with the KL divergence), we minimize an upper bound on any loss function from this set. This property justifies the broad use of the log-loss in regression, decision trees, deep neural networks, and many other applications. In addition, we show that the KL divergence bounds from above any separable Bregman divergence that is convex in its second argument (up to a multiplicative normalization constant). This result introduces a new set of divergence inequalities, similar to the well-known Pinsker inequality. [en_US]
dc.language.iso: en
dc.publisher: IEEE [en_US]
dc.relation.isversionof: 10.1109/isit.2018.8437786 [en_US]
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike [en_US]
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/ [en_US]
dc.source: arXiv [en_US]
dc.title: On the Universality of the Logistic Loss Function [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Painsky, Amichai and Wornell, Gregory. 2018. "On the Universality of the Logistic Loss Function."
dc.eprint.version: Original manuscript [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
eprint.status: http://purl.org/eprint/status/NonPeerReviewed [en_US]
dc.date.updated: 2019-07-09T12:57:43Z
dspace.date.submission: 2019-07-09T12:57:44Z
mit.license: OPEN_ACCESS_POLICY
mit.metadata.status: Authority Work and Publication Information Needed [en_US]
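The bound described in the abstract can be illustrated numerically in its simplest instance: for a binary outcome, the divergence associated with the quadratic (Brier) loss is 2(p − q)², and Pinsker's inequality guarantees it never exceeds the KL divergence. The sketch below is not taken from the paper; it is a minimal independent check of that special case.

```python
import math

def kl(p, q):
    """Binary KL divergence D(p || q), in nats."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def sq_divergence(p, q):
    """Divergence of the quadratic (Brier) loss on a binary outcome:
    (p - q)^2 + ((1 - p) - (1 - q))^2 = 2 * (p - q)^2."""
    return 2 * (p - q) ** 2

# Check on a grid of interior probabilities that the quadratic divergence
# never exceeds the KL divergence (the binary case of Pinsker's inequality).
grid = [i / 100 for i in range(1, 100)]
holds = all(sq_divergence(p, q) <= kl(p, q) + 1e-12 for p in grid for q in grid)
print(holds)  # True
```

Here the multiplicative normalization constant from the abstract equals 1; the paper's contribution is that an analogous constant exists for every smooth, proper, convex loss, not only the quadratic one.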

