dc.contributor.author	Liu, Ying
dc.contributor.author	Willsky, Alan S.
dc.date.accessioned	2015-02-06T14:25:03Z
dc.date.available	2015-02-06T14:25:03Z
dc.date.issued	2013-12
dc.identifier.isbn	978-1-63266-024-4
dc.identifier.uri	http://hdl.handle.net/1721.1/93883
dc.description.abstract	Gaussian Graphical Models (GGMs), or Gauss-Markov random fields, are widely used in many applications, and the trade-off between modeling capacity and the efficiency of learning and inference has been an important research problem. In this paper, we study the family of GGMs with small feedback vertex sets (FVSs), where an FVS is a set of nodes whose removal breaks all the cycles. Exact inference, such as computing the marginal distributions and the partition function, has complexity O(k²n) using message-passing algorithms, where k is the size of the FVS and n is the total number of nodes. We propose efficient structure learning algorithms for two cases: 1) All nodes are observed, which is useful in modeling social or flight networks where the FVS nodes often correspond to a small number of highly influential nodes, or hubs, while the rest of the network is modeled by a tree. Regardless of the maximum degree, without knowing the full graph structure, we can exactly compute the maximum likelihood estimate with complexity O(kn² + n² log n) if the FVS is known, or in polynomial time if the FVS is unknown but has bounded size. 2) The FVS nodes are latent variables, where structure learning is equivalent to decomposing an inverse covariance matrix (exactly or approximately) into the sum of a tree-structured matrix and a low-rank matrix. By incorporating efficient inference into the learning steps, we obtain a learning algorithm using alternating low-rank corrections with complexity O(kn² + n² log n) per iteration. We perform experiments using both synthetic data and real flight delay data to demonstrate the modeling capacity with FVSs of various sizes.	en_US
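The abstract's O(k²n) inference claim rests on conditioning on the FVS nodes and solving the remaining tree-structured problem once per FVS node. The following is a minimal NumPy sketch of that block-elimination step, not the authors' implementation: the names (fvs_gaussian_mean, J, h, fvs) are invented for illustration, and the tree solves are done with a dense solver for brevity, whereas the paper's algorithm would run k + 1 Gaussian belief propagation passes on the tree.

import numpy as np

def fvs_gaussian_mean(J, h, fvs):
    """Solve J mu = h by eliminating the feedback vertex set (FVS) block.

    J   : (n, n) precision (information) matrix of the GGM
    h   : (n,) potential vector
    fvs : indices of the FVS (size k)

    The algebra holds for any positive-definite J; the FVS structure only
    matters for efficiency, since removing the FVS leaves a tree on which
    each solve against J_TT could be a linear-time belief propagation pass.
    """
    n = J.shape[0]
    fvs = np.asarray(fvs)
    rest = np.setdiff1d(np.arange(n), fvs)

    J_TT = J[np.ix_(rest, rest)]   # tree-structured block after removing the FVS
    J_TF = J[np.ix_(rest, fvs)]
    J_FT = J[np.ix_(fvs, rest)]
    J_FF = J[np.ix_(fvs, fvs)]

    # One "tree solve" for h_T plus one per FVS node (k + 1 solves in total).
    sols = np.linalg.solve(J_TT, np.column_stack([h[rest], J_TF]))
    xT, B = sols[:, 0], sols[:, 1:]

    # k x k corrected system (Schur complement) gives the FVS means.
    S = J_FF - J_FT @ B
    mu_F = np.linalg.solve(S, h[fvs] - J_FT @ xT)

    mu = np.empty(n)
    mu[fvs] = mu_F
    mu[rest] = xT - B @ mu_F       # back-substitute into the tree block
    return mu

# Toy check against a dense solve on a random positive-definite J:
rng = np.random.default_rng(0)
M = rng.normal(size=(6, 6))
J = M @ M.T + 6.0 * np.eye(6)
h = rng.normal(size=6)
assert np.allclose(fvs_gaussian_mean(J, h, [0, 1]), np.linalg.solve(J, h))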
dc.description.sponsorship	United States. Air Force Office of Scientific Research (Grant FA9550-12-1-0287)	en_US
dc.language.iso	en_US
dc.publisher	Neural Information Processing Systems Foundation	en_US
dc.relation.isversionof	http://toc.proceedings.com/21521webtoc.pdf	en_US
dc.rights	Creative Commons Attribution-Noncommercial-Share Alike	en_US
dc.rights.uri	http://creativecommons.org/licenses/by-nc-sa/4.0/	en_US
dc.source	Prof. Willsky via Chris Sherratt	en_US
dc.title	Learning Gaussian Graphical Models with Observed or Latent FVSs	en_US
dc.type	Article	en_US
dc.identifier.citation	Liu, Ying, and Alan S. Willsky. "Learning Gaussian Graphical Models with Observed or Latent FVSs." Advances in Neural Information Processing Systems 26 (December 2013).	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science	en_US
dc.contributor.department	Massachusetts Institute of Technology. Laboratory for Information and Decision Systems	en_US
dc.contributor.approver	Willsky, Alan S.	en_US
dc.contributor.mitauthor	Liu, Ying	en_US
dc.contributor.mitauthor	Willsky, Alan S.	en_US
dc.relation.journal	Advances in Neural Information Processing Systems (NIPS) 26	en_US
dc.eprint.version	Author's final manuscript	en_US
dc.type.uri	http://purl.org/eprint/type/ConferencePaper	en_US
eprint.status	http://purl.org/eprint/status/NonPeerReviewed	en_US
dspace.orderedauthors	Liu, Ying; Willsky, Alan S.	en_US
dc.identifier.orcid	https://orcid.org/0000-0003-0149-5888
mit.license	OPEN_ACCESS_POLICY	en_US
mit.metadata.status	Complete

