
dc.contributor.author: Calmon, Flavio du Pin
dc.contributor.author: Polyanskiy, Yury
dc.contributor.author: Wu, Yihong
dc.date.accessioned: 2021-11-01T18:42:18Z
dc.date.available: 2021-11-01T18:42:18Z
dc.date.issued: 2018-03
dc.identifier.issn: 0018-9448
dc.identifier.issn: 1557-9654
dc.identifier.uri: https://hdl.handle.net/1721.1/137039
dc.description.abstract [en_US]: © 1963-2012 IEEE. This paper quantifies the intuitive observation that adding noise reduces available information by means of nonlinear strong data processing inequalities. Consider the random variables $W \to X \to Y$ forming a Markov chain, where $Y = X + Z$ with $X$ and $Z$ real valued, independent, and $X$ bounded in $L_{p}$-norm. It is shown that $I(W;Y) \le F_{I}(I(W;X))$ with $F_{I}(t) < t$ whenever $t > 0$, if and only if $Z$ has a density whose support is not disjoint from any translate of itself. A related question is to characterize for which couplings $(W, X)$ the mutual information $I(W;Y)$ is close to the maximum possible. To that end, we show that in order to saturate the channel, i.e., for $I(W;Y)$ to approach capacity, it is mandatory that $I(W;X) \to \infty$ (under suitable conditions on the channel). A key ingredient for this result is a deconvolution lemma, which shows that post-convolution total variation distance bounds the pre-convolution Kolmogorov–Smirnov distance. Explicit bounds are provided for the special case of the additive Gaussian noise channel with a quadratic cost constraint. These bounds are shown to be order optimal. For this case, simplified proofs are provided leveraging Gaussian-specific tools such as the connection between information and estimation (I-MMSE) and Talagrand's information-transportation inequality.
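As a concrete numerical sketch of the inequality $I(W;Y) < I(W;X)$ stated in the abstract (this example is not from the paper): take $W$ a fair coin, $X = 2W - 1$ (so $X$ is bounded and $I(W;X) = H(W) = 1$ bit), and $Y = X + Z$ with Gaussian $Z$. Since $Y \mid W$ is Gaussian, $I(W;Y)$ can be computed by numerical integration, and it comes out strictly below 1 bit, as the strong data processing inequality predicts.

```python
import numpy as np
from scipy.integrate import quad

# W ~ Bernoulli(1/2), X = 2W - 1 (so |X| <= 1), Y = X + Z with Z ~ N(0, sigma^2).
# X is a deterministic function of W, hence I(W;X) = H(W) = 1 bit.
sigma = 1.0

def gauss(y, mu):
    """Density of N(mu, sigma^2) at y."""
    return np.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

def p_y(y):
    """Marginal density of Y: an equal mixture of N(+1, sigma^2) and N(-1, sigma^2)."""
    return 0.5 * gauss(y, 1.0) + 0.5 * gauss(y, -1.0)

# I(W;Y) = h(Y) - h(Y|W), in bits.  h(Y|W) is the entropy of one Gaussian component.
h_y = quad(lambda y: -p_y(y) * np.log2(p_y(y)), -10, 10)[0]
h_y_given_w = 0.5 * np.log2(2 * np.pi * np.e * sigma ** 2)

i_wy = h_y - h_y_given_w
print(f"I(W;X) = 1.000 bit,  I(W;Y) = {i_wy:.3f} bits  (strictly less than 1)")
```

The gap between $I(W;X) = 1$ and $I(W;Y)$ is exactly the kind of strict contraction the paper's function $F_I$ quantifies.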
dc.language.iso: en
dc.publisher [en_US]: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation.isversionof [en_US]: 10.1109/tit.2017.2782359
dc.rights [en_US]: Creative Commons Attribution-Noncommercial-Share Alike
dc.rights.uri [en_US]: http://creativecommons.org/licenses/by-nc-sa/4.0/
dc.source [en_US]: MIT web domain
dc.title [en_US]: Strong Data Processing Inequalities for Input Constrained Additive Noise Channels
dc.type [en_US]: Article
dc.identifier.citation: Calmon, Flavio du Pin, Polyanskiy, Yury and Wu, Yihong. 2018. "Strong Data Processing Inequalities for Input Constrained Additive Noise Channels." IEEE Transactions on Information Theory, 64 (3).
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.eprint.version [en_US]: Original manuscript
dc.type.uri [en_US]: http://purl.org/eprint/type/JournalArticle
eprint.status [en_US]: http://purl.org/eprint/status/NonPeerReviewed
dc.date.updated: 2019-07-01T18:08:09Z
dspace.date.submission: 2019-07-01T18:08:10Z
mit.journal.volume [en_US]: 64
mit.journal.issue [en_US]: 3
mit.license: OPEN_ACCESS_POLICY
mit.metadata.status [en_US]: Authority Work and Publication Information Needed

