Information-Distilling Quantizers
Author(s)
Bhatt, Alankrita; Nazer, Bobak; Ordentlich, Or; Polyanskiy, Yury
Terms of use
Open Access Policy; Creative Commons Attribution-Noncommercial-Share Alike
Abstract
Let X and Y be dependent random variables. This paper considers the problem of designing a scalar quantizer for Y to maximize the mutual information between the quantizer's output and X, and develops fundamental properties and bounds for this form of quantization, which is connected to the log-loss distortion criterion. The main focus is the regime of low I(X; Y), where it is shown that, if X is binary, a constant fraction of the mutual information can always be preserved using O(log(1/I(X; Y))) quantization levels, and that there exist distributions for which this many quantization levels are necessary. Furthermore, for larger finite alphabets 2 < |X| < ∞, it is established that an η-fraction of the mutual information can be preserved using roughly (log(|X|/I(X; Y)))^{η·(|X|−1)} quantization levels.
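The setup in the abstract, choosing a K-level scalar quantizer f for Y so as to maximize I(X; f(Y)), can be illustrated with a small brute-force computation. The sketch below is not the paper's construction: it assumes binary X and a finite Y alphabet specified by a joint pmf array, and it searches only over quantizers whose cells are contiguous after sorting the Y-symbols by the posterior P(X = 1 | Y = y), a restriction known to be without loss of optimality for binary X. The function names and interface are hypothetical.

```python
import itertools
import numpy as np


def mutual_information(p_xz):
    """I(X; Z) in nats for a joint pmf given as a 2-D array p_xz[x, z]."""
    px = p_xz.sum(axis=1, keepdims=True)
    pz = p_xz.sum(axis=0, keepdims=True)
    mask = p_xz > 0
    return float(np.sum(p_xz[mask] * np.log(p_xz[mask] / (px @ pz)[mask])))


def best_interval_quantizer(p_xy, k):
    """Brute-force search for a k-level quantizer f of Y maximizing I(X; f(Y)).

    p_xy: 2 x m joint pmf of (X, Y) with binary X.  Only quantizers whose
    cells are contiguous in the order of the posterior P(X=1 | Y=y) are
    considered (sufficient for binary X).  Assumes k <= m.
    Returns (best I(X; f(Y)), cut points as indices into the sorted order).
    """
    m = p_xy.shape[1]
    py = p_xy.sum(axis=0)
    posterior = np.divide(p_xy[1], py, out=np.zeros(m), where=py > 0)
    order = np.argsort(posterior)          # sort Y-symbols by P(X=1 | Y=y)
    sorted_pxy = p_xy[:, order]

    best_val, best_cuts = -1.0, None
    # choose k-1 cut points among the m-1 gaps between sorted symbols
    for cuts in itertools.combinations(range(1, m), k - 1):
        edges = (0,) + cuts + (m,)
        # joint pmf of (X, f(Y)): merge the columns inside each cell
        p_xz = np.stack(
            [sorted_pxy[:, a:b].sum(axis=1) for a, b in zip(edges, edges[1:])],
            axis=1,
        )
        val = mutual_information(p_xz)
        if val > best_val:
            best_val, best_cuts = val, cuts
    return best_val, best_cuts


if __name__ == "__main__":
    # Toy example: uniform binary X observed through a random 8-symbol channel.
    rng = np.random.default_rng(0)
    p_y_given_x = rng.dirichlet(np.ones(8), size=2)   # 2 x 8 conditional pmf
    p_xy = 0.5 * p_y_given_x                          # joint pmf with uniform X
    full = mutual_information(p_xy)
    val, cuts = best_interval_quantizer(p_xy, k=3)
    print(f"I(X; Y) = {full:.4f} nats, best 3-level I(X; f(Y)) = {val:.4f} nats")
```

The exhaustive search over cut points is only meant to make the objective concrete for small alphabets; the paper's results concern how many levels are needed to preserve a given fraction of I(X; Y), not an algorithm of this form.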
Date issued
2021
Department
Massachusetts Institute of Technology. Laboratory for Information and Decision Systems; Statistics and Data Science Center (Massachusetts Institute of Technology)
Journal
IEEE Transactions on Information Theory
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Bhatt, Alankrita, Nazer, Bobak, Ordentlich, Or and Polyanskiy, Yury. 2021. "Information-Distilling Quantizers." IEEE Transactions on Information Theory, 67 (4).
Version: Original manuscript