dc.contributor.author  Goldfeld, Ziv
dc.contributor.author  Greenewald, Kristjan
dc.contributor.author  Weed, Jonathan
dc.contributor.author  Polyanskiy, Yury
dc.date.accessioned  2021-11-01T18:45:11Z
dc.date.available  2021-11-01T18:45:11Z
dc.date.issued  2019-09
dc.identifier.uri  https://hdl.handle.net/1721.1/137042
dc.description.abstract  © 2019 IEEE. This paper establishes the optimality of the plug-in estimator for the problem of differential entropy estimation under Gaussian convolutions. Specifically, we consider the estimation of the differential entropy h(X + Z), where X and Z are independent d-dimensional random variables with $Z \sim \mathcal{N}(0, \sigma^2 \mathrm{I}_d)$. The distribution of X is unknown and belongs to some nonparametric class, but n independently and identically distributed samples from it are available. We first show that despite the regularizing effect of noise, any good estimator (within an additive gap) for this problem must have a sample complexity that is exponential in d. We then analyze the absolute-error risk of the plug-in estimator and show that it converges as $c^d/\sqrt{n}$, thus attaining the parametric estimation rate. This implies the optimality of the plug-in estimator for the considered problem. We provide numerical results comparing the performance of the plug-in estimator to general-purpose (unstructured) differential entropy estimators (based on kernel density estimation (KDE) or k nearest neighbors (kNN) techniques) applied to samples of X + Z. These results reveal a significant empirical superiority of the plug-in estimator over state-of-the-art KDE- and kNN-based methods.  en_US
dc.language.iso  en
dc.publisher  Institute of Electrical and Electronics Engineers (IEEE)  en_US
dc.relation.isversionof  http://dx.doi.org/10.1109/ISIT.2019.8849414  en_US
dc.rights  Creative Commons Attribution-Noncommercial-Share Alike  en_US
dc.rights.uri  http://creativecommons.org/licenses/by-nc-sa/4.0/  en_US
dc.source  MIT web domain  en_US
dc.title  Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions  en_US
dc.type  Article  en_US
dc.identifier.citation  Goldfeld, Ziv, Greenewald, Kristjan, Weed, Jonathan and Polyanskiy, Yury. 2019. "Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions." IEEE International Symposium on Information Theory - Proceedings, 2019-July.
dc.contributor.department  Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory  en_US
dc.contributor.department  Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science  en_US
dc.relation.journal  IEEE International Symposium on Information Theory - Proceedings  en_US
dc.eprint.version  Author's final manuscript  en_US
dc.type.uri  http://purl.org/eprint/type/ConferencePaper  en_US
eprint.status  http://purl.org/eprint/status/NonPeerReviewed  en_US
dc.date.updated  2021-04-15T16:05:35Z
dspace.orderedauthors  Goldfeld, Z; Greenewald, K; Weed, J; Polyanskiy, Y  en_US
dspace.date.submission  2021-04-15T16:05:36Z
mit.journal.volume  2019-July  en_US
mit.license  OPEN_ACCESS_POLICY
mit.metadata.status  Publication Information Needed  en_US
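
For illustration only (this code is not from the paper): the plug-in estimator described in the abstract is the differential entropy of the empirical distribution of the n samples convolved with $\mathcal{N}(0, \sigma^2 \mathrm{I}_d)$, i.e. the entropy of an n-component Gaussian mixture. Below is a minimal Python sketch, assuming the mixture entropy is evaluated by Monte Carlo; the function name plugin_entropy and all parameter choices are hypothetical.

```python
import numpy as np
from scipy.special import logsumexp

def plugin_entropy(x, sigma, n_mc=5000, rng=None):
    """Monte Carlo evaluation of the plug-in estimate of h(X + Z):
    the differential entropy of the Gaussian mixture
    (1/n) * sum_i N(x_i, sigma^2 I_d) induced by the samples x (shape (n, d))."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = x.shape
    # Draw Y ~ p_hat: pick a mixture center uniformly, add N(0, sigma^2 I_d) noise.
    centers = x[rng.integers(n, size=n_mc)]
    y = centers + sigma * rng.standard_normal((n_mc, d))
    # log p_hat(y) = logsumexp_i(-||y - x_i||^2 / (2 sigma^2))
    #               - log n - (d/2) log(2 pi sigma^2)
    sq_dists = ((y[:, None, :] - x[None, :, :]) ** 2).sum(axis=-1)  # (n_mc, n)
    log_p = (logsumexp(-sq_dists / (2 * sigma**2), axis=1)
             - np.log(n) - 0.5 * d * np.log(2 * np.pi * sigma**2))
    return -log_p.mean()  # h(p_hat) = -E[log p_hat(Y)]

# Sanity check (d = 1): X ~ N(0, 1) and sigma = 1, so X + Z ~ N(0, 2) and
# h(X + Z) = 0.5 * log(4 * pi * e) ~ 1.766 nats.
rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 1))
print(plugin_entropy(x, sigma=1.0, rng=rng))
```

The logsumexp call keeps the mixture log-density numerically stable when the exponents are large and negative; the sanity check compares the estimate against the closed-form Gaussian entropy.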

