Learning Sparse Gaussian Graphical Model with l0-regularization
Author(s)
Mu, Beipeng; How, Jonathan
Abstract
For the problem of learning sparse Gaussian graphical models, it is desirable to obtain both a sparse structure and good parameter estimates. Classical techniques, such as optimizing the l1-regularized maximum likelihood or the Chow-Liu algorithm, either focus on parameter estimation or are constrained to specific structures. This paper proposes an alternative based on l0-regularized maximum likelihood and employs a greedy algorithm to solve the optimization problem. We show that, when the graph is acyclic, the greedy solution finds the optimal acyclic graph. We also show that it can update the parameters in constant time when connecting two sub-components, and thus works efficiently on sparse graphs. Empirical results demonstrate that the new algorithm can efficiently learn sparse structures with cycles and that it dominates the l1-regularized approach in graph likelihood.
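As a concrete illustration of the acyclic case mentioned in the abstract, the sketch below is not the paper's implementation; the function name greedy_l0_tree_ggm and the penalty parameter lam are hypothetical. It relies on the standard fact that, for tree-structured Gaussian models, the per-sample log-likelihood gain of adding edge (i, j) is the Gaussian mutual information -0.5*log(1 - rho_ij^2), so a Kruskal-style greedy pass that keeps only edges whose gain exceeds a per-edge l0 penalty returns the best acyclic structure.

```python
import numpy as np

def greedy_l0_tree_ggm(X, lam):
    """Sketch: greedily build an acyclic Gaussian graphical model under an
    l0 penalty of `lam` per edge (illustrative, not the paper's code).

    For tree-structured Gaussian models, adding edge (i, j) to a forest raises
    the per-sample maximum log-likelihood by the Gaussian mutual information
    I(X_i; X_j) = -0.5 * log(1 - rho_ij**2), so sorting edges by penalized
    gain and adding them while the graph stays acyclic is optimal.
    """
    n, d = X.shape
    rho = np.corrcoef(X, rowvar=False)          # d x d sample correlation matrix

    # Score every candidate edge by its penalized likelihood gain.
    scores = []
    for i in range(d):
        for j in range(i + 1, d):
            gain = -0.5 * np.log(1.0 - rho[i, j] ** 2)
            scores.append((gain - lam, i, j))
    scores.sort(reverse=True)

    # Union-find keeps the selected edge set acyclic (a forest).
    parent = list(range(d))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]       # path compression
            a = parent[a]
        return a

    edges = []
    for score, i, j in scores:
        if score <= 0:
            break                               # remaining edges cannot improve the objective
        ri, rj = find(i), find(j)
        if ri != rj:                            # adding (i, j) keeps the graph acyclic
            parent[ri] = rj
            edges.append((i, j))
    return edges

if __name__ == "__main__":
    X = np.random.randn(500, 5)                 # toy data: 500 samples, 5 variables
    print(greedy_l0_tree_ggm(X, lam=0.05))
```

Note that this sketch covers only the acyclic guarantee; the paper's algorithm also handles structures with cycles, which is not reproduced here.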
Date issued
2014-08-22
Keywords
Gaussian Graphical Models, l0 regularization