DSpace@MIT

Strong Data Processing Inequalities for Input Constrained Additive Noise Channels

Author(s)
Calmon, Flavio du Pin; Polyanskiy, Yury; Wu, Yihong
Download: Submitted version (331.5 KB)
Open Access Policy

Terms of use
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/)
Abstract
This paper quantifies the intuitive observation that adding noise reduces available information by means of nonlinear strong data processing inequalities. Consider the random variables $W \to X \to Y$ forming a Markov chain, where $Y = X + Z$ with $X$ and $Z$ real valued, independent, and $X$ bounded in $L_p$-norm. It is shown that $I(W;Y) \le F_I(I(W;X))$ with $F_I(t) < t$ whenever $t > 0$, if and only if $Z$ has a density whose support is not disjoint from any translate of itself. A related question is to characterize for which couplings $(W, X)$ the mutual information $I(W;Y)$ is close to the maximum possible. To that end, we show that in order to saturate the channel, i.e., for $I(W;Y)$ to approach capacity, it is mandatory that $I(W;X) \to \infty$ (under suitable conditions on the channel). A key ingredient for this result is a deconvolution lemma which shows that post-convolution total variation distance bounds the pre-convolution Kolmogorov-Smirnov distance. Explicit bounds are provided for the special case of the additive Gaussian noise channel with quadratic cost constraint. These bounds are shown to be order optimal. For this case, simplified proofs are provided leveraging Gaussian-specific tools such as the connection between information and estimation (I-MMSE) and Talagrand's information-transportation inequality.
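
For convenience, the central inequality summarized in the abstract can be written out explicitly. The display below is only a restatement of the setup described above; the constraint level $a$ and exponent $p$ are named here for illustration and are not notation taken from the record itself.

For the Markov chain $W \to X \to Y$ with $Y = X + Z$, where $X$ and $Z$ are independent real-valued random variables and $\|X\|_{L_p} \le a$ for some fixed $p \ge 1$ and $a < \infty$,
$$I(W;Y) \le F_I\bigl(I(W;X)\bigr), \qquad F_I(t) < t \ \text{ for every } t > 0,$$
and, per the abstract, such a strictly contracting $F_I$ exists if and only if $Z$ has a density whose support is not disjoint from any translate of itself.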
Date issued
2018-03
URI
https://hdl.handle.net/1721.1/137039
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Calmon, Flavio du Pin, Polyanskiy, Yury and Wu, Yihong. 2018. "Strong Data Processing Inequalities for Input Constrained Additive Noise Channels." IEEE Transactions on Information Theory 64 (3).
Version: Original manuscript
ISSN
0018-9448
1557-9654

Collections
  • MIT Open Access Articles
