Divergence Covering
Author(s)
Tang, Jennifer
Advisor
Polyanskiy, Yury
Abstract
A longstanding problem of interest is that of finding covering numbers. A very important measure between probability distributions is the Kullback-Leibler (KL) divergence. Both topics have been studied extensively in various contexts, and in this thesis we focus on the problem that arises when the two concepts are combined. This combination yields interesting techniques for deriving useful bounds on a number of important problems in information theory. Our goal is to explore covering the probability simplex in terms of KL divergence. Several properties of KL divergence (e.g., it is not a metric, it is not symmetric, and it can easily be infinite) make it unintuitive and difficult to analyze with traditional methods. We study covering of discrete large-alphabet probability distributions under both worst-case and average-case divergence distance and examine the implications of these divergence covering numbers. One implication of worst-case divergence covering is determining how to communicate probability distributions with limited communication bandwidth. Another implication is in universal compression and universal prediction, where the divergence covering number provides upper bounds on minimax risk. A third application is computing the capacity of the noisy permutation channel. Finally, we use average-case divergence covering to study efficient algorithms for quantizing large-alphabet distributions in order to save storage space.
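As a concrete illustration of the divergence discussed above (this sketch is not from the thesis; the function name kl_divergence and the example distributions are hypothetical), the following minimal Python snippet computes D(P||Q) = sum_i p_i log(p_i / q_i) for discrete distributions and shows the asymmetry and possible blow-up to infinity mentioned in the abstract.

import numpy as np

def kl_divergence(p, q):
    # KL divergence D(p || q) in nats for discrete distributions.
    # Terms with p_i = 0 contribute 0; p_i > 0 against q_i = 0 gives infinity.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    support = p > 0
    if np.any(q[support] == 0):
        return np.inf
    return float(np.sum(p[support] * np.log(p[support] / q[support])))

# Not symmetric: D(P || Q) != D(Q || P) in general.
P = [0.5, 0.5]
Q = [0.9, 0.1]
print(kl_divergence(P, Q))  # about 0.511
print(kl_divergence(Q, P))  # about 0.368

# Can be infinite when Q assigns zero mass where P does not.
print(kl_divergence([0.5, 0.5], [1.0, 0.0]))  # inf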
Date issued
2022-02
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology