The Hessian Penalty: A Weak Prior for Unsupervised Disentanglement
Author(s)
Peebles, William; Peebles, John; Zhu, Jun-Yan; Efros, Alexei; Torralba, Antonio
Open Access Policy
Creative Commons Attribution-Noncommercial-Share Alike
Terms of use
Metadata
Abstract
Existing disentanglement methods for deep generative models rely on hand-picked priors and complex encoder-based architectures. In this paper, we propose the Hessian Penalty, a simple regularization term that encourages the Hessian of a generative model with respect to its input to be diagonal. We introduce a model-agnostic, unbiased stochastic approximation of this term based on Hutchinson’s estimator to compute it efficiently during training. Our method can be applied to a wide range of deep generators with just a few lines of code. We show that training with the Hessian Penalty often causes axis-aligned disentanglement to emerge in latent space when applied to ProGAN on several datasets. Additionally, we use our regularization term to identify interpretable directions in BigGAN’s latent space in an unsupervised fashion. Finally, we provide empirical evidence that the Hessian Penalty encourages substantial shrinkage when applied to over-parameterized latent spaces. Videos of our disentanglement results are available at www.wpeebles.com/hessian-penalty, and code is available at https://github.com/wpeebles/hessian_penalty.
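The Hutchinson-style estimator described above can be sketched in a few lines. The following is a rough NumPy illustration, not the authors' reference implementation (the official code is at the linked repository); the function name `hessian_penalty` and the parameters `k` (number of Rademacher probes) and `eps` (finite-difference step) are illustrative assumptions. The key facts it relies on are standard: for Rademacher vectors v, the variance of the second directional derivative vᵀHv across probes is proportional to the sum of squared off-diagonal Hessian entries, and vᵀHv can itself be approximated by a central second-order finite difference of the generator.

```python
import numpy as np

def hessian_penalty(G, z, k=4, eps=0.1, rng=None):
    """Stochastic estimate of the off-diagonal Hessian mass of G at z.

    G   : function mapping a latent vector z to a scalar (or array) output.
    z   : latent vector, shape (d,).
    k   : number of Rademacher probe vectors (more probes -> lower variance).
    eps : finite-difference step size for the second directional derivative.

    Illustrative sketch only; parameter names and defaults are assumptions.
    """
    rng = np.random.default_rng(rng)
    second_dirs = []
    for _ in range(k):
        # Rademacher probe: each entry is +1 or -1 with equal probability.
        v = rng.choice([-1.0, 1.0], size=np.shape(z))
        # Central finite difference approximating v^T H v (exact for quadratics).
        d2 = (G(z + eps * v) - 2.0 * G(z) + G(z - eps * v)) / eps ** 2
        second_dirs.append(d2)
    second_dirs = np.stack(second_dirs)
    # Variance across probes vanishes iff the Hessian is diagonal (in which
    # case v^T H v = sum_i H_ii is the same for every Rademacher v), so it
    # serves as a penalty on off-diagonal entries; average over outputs.
    return np.var(second_dirs, axis=0).mean()
```

For a function with a diagonal Hessian such as G(z) = Σᵢ zᵢ², every probe yields the same second directional derivative and the penalty is zero; a mixed term like G(z) = z₀z₁ produces probe-dependent values and a positive penalty. In a training loop this scalar would simply be added, suitably weighted, to the generator loss.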
Description
Part of the Lecture Notes in Computer Science book series (LNCS, volume 12351)
Date issued
2020-11
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Journal
Lecture Notes in Computer Science
Publisher
Springer International Publishing
Citation
Peebles, William, et al. "The Hessian Penalty: A Weak Prior for Unsupervised Disentanglement."
ECCV: European Conference on Computer Vision, Lecture Notes in Computer Science, vol. 12351, Springer International Publishing, 2020, pp. 581-597. © 2020, Springer Nature
Version: Author's final manuscript
ISBN
9783030585389
9783030585396
ISSN
0302-9743
1611-3349