Communication complexity of estimating correlations
Author(s)
Hadar, U.; Liu, J.; Polyanskiy, Yury; Shayevitz, O.
Terms of use
Open Access Policy
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
© 2019 Association for Computing Machinery. We characterize the communication complexity of the following distributed estimation problem. Alice and Bob observe infinitely many i.i.d. copies of ρ-correlated unit-variance (Gaussian or ±1 binary) random variables, with unknown ρ ∈ [−1, 1]. By interactively exchanging k bits, Bob wants to produce an estimate ρ̂ of ρ. We show that the best possible performance (optimized over the interaction protocol Π and the estimator ρ̂) satisfies inf_{Π, ρ̂} sup_ρ E[|ρ̂ − ρ|²] = k⁻¹ (1/(2 ln 2) + o(1)). Curiously, the number of samples in our achievability scheme is exponential in k; by contrast, a naive scheme exchanging k samples achieves the same Ω(1/k) rate but with a suboptimal prefactor. Our protocol achieving optimal performance is one-way (non-interactive). We also prove the Ω(1/k) bound even when ρ is restricted to any small open sub-interval of [−1, 1] (i.e., a local minimax lower bound). Our proof techniques rely on symmetric strong data-processing inequalities and various tensorization techniques from information-theoretic interactive common-randomness extraction. Our results also imply an Ω(n) lower bound on the information complexity of the Gap-Hamming problem, for which we show a direct information-theoretic proof.
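As a quick illustration of the naive baseline mentioned in the abstract (not the paper's optimal one-way protocol), the sketch below simulates the scheme in which Alice simply forwards k of her ±1 samples and Bob reports the empirical correlation as ρ̂. The function name simulate_naive_scheme and all parameter choices are illustrative assumptions, not from the paper; for this particular baseline the risk works out to (1 − ρ²)/k, i.e. a worst-case prefactor of 1 per bit, compared with the paper's optimal 1/(2 ln 2) ≈ 0.72.

```python
# Minimal simulation sketch of the naive k-sample exchange scheme
# (an assumed baseline for illustration, not the authors' protocol).
import numpy as np

def simulate_naive_scheme(rho, k, trials=10000, seed=0):
    """Monte Carlo estimate of E[(rho_hat - rho)^2] for the naive scheme."""
    rng = np.random.default_rng(seed)
    # Alice's samples: uniform +/-1. Bob's sample agrees with Alice's with
    # probability (1 + rho)/2, which gives E[X*Y] = rho for unit-variance
    # +/-1 variables.
    x = rng.choice([-1, 1], size=(trials, k))
    agree = rng.random((trials, k)) < (1 + rho) / 2
    y = np.where(agree, x, -x)
    # Alice sends her k bits verbatim; Bob averages the products.
    rho_hat = (x * y).mean(axis=1)
    return np.mean((rho_hat - rho) ** 2)

if __name__ == "__main__":
    k = 1000
    for rho in (0.0, 0.5, 0.9):
        mse = simulate_naive_scheme(rho, k)
        # For this scheme, k * MSE should be close to 1 - rho^2 (worst case 1
        # at rho = 0), versus the optimal prefactor 1/(2 ln 2) ~ 0.72.
        print(f"rho={rho:.1f}  k*MSE ~ {k * mse:.3f}  (theory {1 - rho**2:.3f})")
```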
Date issued
2019-06
Department
Massachusetts Institute of Technology. Institute for Data, Systems, and Society; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Proceedings of the Annual ACM Symposium on Theory of Computing
Publisher
Association for Computing Machinery (ACM)
Citation
2019. "Communication complexity of estimating correlations." Proceedings of the Annual ACM Symposium on Theory of Computing.
Version: Original manuscript