Show simple item record

dc.contributor.author: Tan, Vincent Yan Fu
dc.contributor.author: Anandkumar, Animashree
dc.contributor.author: Willsky, Alan S.
dc.date.accessioned: 2011-04-19T19:26:48Z
dc.date.available: 2011-04-19T19:26:48Z
dc.date.issued: 2010-01
dc.date.submitted: 2009-09
dc.identifier.isbn: 978-1-4244-5870-7
dc.identifier.isbn: 978-1-4244-5871-4
dc.identifier.other: INSPEC Accession Number: 11135184
dc.identifier.uri: http://hdl.handle.net/1721.1/62239
dc.description.abstract: The problem of learning tree-structured Gaussian graphical models from i.i.d. samples is considered. The influence of the tree structure and the parameters of the Gaussian distribution on the learning rate as the number of samples increases is discussed. Specifically, the error exponent corresponding to the event that the estimated tree structure differs from the actual unknown tree structure of the distribution is analyzed. Finding the error exponent reduces to a least-squares problem in the very noisy learning regime. In this regime, it is shown that universally, the extremal tree structures which maximize and minimize the error exponent are the star and the Markov chain for any fixed set of correlation coefficients on the edges of the tree. In other words, the star and the chain graphs represent the hardest and the easiest structures to learn in the class of tree-structured Gaussian graphical models. This result can also be intuitively explained by correlation decay: pairs of nodes which are far apart, in terms of graph distance, are unlikely to be mistaken as edges by the maximum-likelihood estimator in the asymptotic regime.
dc.description.sponsorship: United States. Air Force Office of Scientific Research (Grant FA9550-08-1-1080)
dc.description.sponsorship: United States. Army Research Office (MURI funded through ARO Grant W911NF-06-1-0076)
dc.description.sponsorship: United States. Air Force Office of Scientific Research (MURI Grant FA9550-06-1-0324)
dc.description.sponsorship: Singapore. Agency for Science, Technology and Research
dc.language.iso: en_US
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.isversionof: http://dx.doi.org/10.1109/ALLERTON.2009.5394929
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
dc.source: IEEE
dc.title: How do the structure and the parameters of Gaussian tree models affect structure learning?
dc.type: Article
dc.identifier.citation: Tan, Vincent Y. F., Animashree Anandkumar, and Alan S. Willsky. "How Do the Structure and the Parameters of Gaussian Tree Models Affect Structure Learning?" Communication, Control, and Computing, 2009. Allerton 2009. 47th Annual Allerton Conference On. 2009. 684-691. © 2010 IEEE.
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.contributor.department: Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
dc.contributor.department: Massachusetts Institute of Technology. Stochastic Systems Group
dc.contributor.approver: Willsky, Alan S.
dc.contributor.mitauthor: Willsky, Alan S.
dc.contributor.mitauthor: Tan, Vincent Yan Fu
dc.contributor.mitauthor: Anandkumar, Animashree
dc.relation.journal: 47th Annual Allerton Conference on Communication, Control, and Computing, 2009. Allerton 2009
dc.eprint.version: Final published version
dc.type.uri: http://purl.org/eprint/type/ConferencePaper
dspace.orderedauthors: Tan, Vincent Y. F.; Anandkumar, Animashree; Willsky, Alan S.
dc.identifier.orcid: https://orcid.org/0000-0003-0149-5888
mit.license: PUBLISHER_POLICY
mit.metadata.status: Complete
