Principal differences analysis: Interpretable characterization of differences between distributions
Author(s): Mueller, Jonas Weylin; Jaakkola, Tommi S
Abstract: We introduce principal differences analysis (PDA) for analyzing differences between high-dimensional distributions. The method operates by finding the projection that maximizes the Wasserstein divergence between the resulting univariate populations. Relying on the Cramér-Wold device, it requires no assumptions about the form of the underlying distributions, nor the nature of their inter-class differences. A sparse variant of the method is introduced to identify features responsible for the differences. We provide algorithms for both the original minimax formulation as well as its semidefinite relaxation. In addition to deriving some convergence results, we illustrate how the approach may be applied to identify differences between cell populations in the somatosensory cortex and hippocampus as manifested by single cell RNA-seq. Our broader framework extends beyond the specific choice of Wasserstein divergence.
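To make the core idea concrete: given two samples in R^d, PDA seeks a direction whose projections of the two populations differ maximally under the 1-D Wasserstein distance. The following NumPy sketch is illustrative only; it replaces the paper's minimax and semidefinite-relaxation algorithms with a crude random search over unit directions, and the function names (`wasserstein_1d`, `pda_random_search`) are hypothetical, not from the paper.

```python
import numpy as np

def wasserstein_1d(u, v):
    # For equal-size 1-D samples, the Wasserstein-1 distance reduces to
    # the mean absolute difference between the sorted values.
    return np.mean(np.abs(np.sort(u) - np.sort(v)))

def pda_random_search(X, Y, n_dirs=2000, seed=0):
    # Illustrative stand-in for the paper's optimization: sample random
    # unit directions and keep the one maximizing the projected divergence.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    best_beta, best_val = None, -np.inf
    for _ in range(n_dirs):
        beta = rng.standard_normal(d)
        beta /= np.linalg.norm(beta)
        val = wasserstein_1d(X @ beta, Y @ beta)
        if val > best_val:
            best_beta, best_val = beta, val
    return best_beta, best_val

# Toy example: two Gaussians that differ only along the first coordinate,
# so the recovered direction should concentrate its weight there.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 5))
Y = rng.standard_normal((500, 5))
Y[:, 0] += 3.0
beta, val = pda_random_search(X, Y)
```

By the Cramér-Wold argument the abstract invokes, two distributions differ if and only if some such 1-D projection differs, which is what justifies searching over directions in the first place.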
Department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Advances in Neural Information Processing Systems 28 (NIPS 2015)
Neural Information Processing Systems Foundation, Inc.
Mueller, Jonas and Tommi Jaakkola. "Principal Differences Analysis: Interpretable Characterization of Differences between Distributions." Advances in Neural Information Processing Systems 28 (NIPS 2015), 7-12 December 2015, Montreal, Canada, NIPS, 2015.
Final published version