| dc.contributor.author | Calmon, Flavio du Pin | |
| dc.contributor.author | Polyanskiy, Yury | |
| dc.contributor.author | Wu, Yihong | |
| dc.date.accessioned | 2021-11-01T18:42:18Z | |
| dc.date.available | 2021-11-01T18:42:18Z | |
| dc.date.issued | 2018-03 | |
| dc.identifier.issn | 0018-9448 | |
| dc.identifier.issn | 1557-9654 | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/137039 | |
| dc.description.abstract | © 1963-2012 IEEE. This paper quantifies the intuitive observation that adding noise reduces available information by means of nonlinear strong data processing inequalities. Consider the random variables $W\to X\to Y$ forming a Markov chain, where $Y = X + Z$ with $X$ and $Z$ real valued, independent and $X$ bounded in $L_p$-norm. It is shown that $I(W; Y) \le F_I(I(W;X))$ with $F_I(t) < t$ whenever $t > 0$, if and only if $Z$ has a density whose support is not disjoint from any translate of itself. A related question is to characterize for what couplings $(W, X)$ the mutual information $I(W; Y)$ is close to maximum possible. To that end we show that in order to saturate the channel, i.e., for $I(W; Y)$ to approach capacity, it is mandatory that $I(W; X)\to\infty$ (under suitable conditions on the channel). A key ingredient for this result is a deconvolution lemma which shows that postconvolution total variation distance bounds the preconvolution Kolmogorov-Smirnov distance. Explicit bounds are provided for the special case of the additive Gaussian noise channel with quadratic cost constraint. These bounds are shown to be order optimal. For this case, simplified proofs are provided leveraging Gaussian-specific tools such as the connection between information and estimation (I-MMSE) and Talagrand's information-transportation inequality. | en_US |
| dc.language.iso | en | |
| dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US |
| dc.relation.isversionof | 10.1109/tit.2017.2782359 | en_US |
| dc.rights | Creative Commons Attribution-Noncommercial-Share Alike | en_US |
| dc.rights.uri | http://creativecommons.org/licenses/by-nc-sa/4.0/ | en_US |
| dc.source | MIT web domain | en_US |
| dc.title | Strong Data Processing Inequalities for Input Constrained Additive Noise Channels | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | Calmon, Flavio du Pin, Polyanskiy, Yury and Wu, Yihong. 2018. "Strong Data Processing Inequalities for Input Constrained Additive Noise Channels." 64 (3). | |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | |
| dc.eprint.version | Original manuscript | en_US |
| dc.type.uri | http://purl.org/eprint/type/JournalArticle | en_US |
| eprint.status | http://purl.org/eprint/status/NonPeerReviewed | en_US |
| dc.date.updated | 2019-07-01T18:08:09Z | |
| dspace.date.submission | 2019-07-01T18:08:10Z | |
| mit.journal.volume | 64 | en_US |
| mit.journal.issue | 3 | en_US |
| mit.license | OPEN_ACCESS_POLICY | |
| mit.metadata.status | Authority Work and Publication Information Needed | en_US |