Show simple item record

dc.contributor.author	Tan, Vincent Yan Fu
dc.contributor.author	Anandkumar, Animashree
dc.contributor.author	Willsky, Alan S.
dc.date.accessioned	2011-04-06T13:48:59Z
dc.date.available	2011-04-06T13:48:59Z
dc.date.issued	2010-04
dc.date.submitted	2008-09
dc.identifier.issn	1053-587X
dc.identifier.other	INSPEC Accession Number: 11228720
dc.identifier.uri	http://hdl.handle.net/1721.1/62145
dc.description.abstract	The problem of learning tree-structured Gaussian graphical models from independent and identically distributed (i.i.d.) samples is considered. The influence of the tree structure and the parameters of the Gaussian distribution on the learning rate as the number of samples increases is discussed. Specifically, the error exponent corresponding to the event that the estimated tree structure differs from the actual unknown tree structure of the distribution is analyzed. Finding the error exponent reduces to a least-squares problem in the very noisy learning regime. In this regime, it is shown that the extremal tree structure that minimizes the error exponent is the star for any fixed set of correlation coefficients on the edges of the tree. If the magnitudes of all the correlation coefficients are less than 0.63, it is also shown that the tree structure that maximizes the error exponent is the Markov chain. In other words, the star and the chain graphs represent the hardest and the easiest structures to learn in the class of tree-structured Gaussian graphical models. This result can also be intuitively explained by correlation decay: pairs of nodes which are far apart, in terms of graph distance, are unlikely to be mistaken as edges by the maximum-likelihood estimator in the asymptotic regime.	en_US
dc.description.sponsorship	United States. Air Force Office of Scientific Research (Grant FA9550-08-1-1080)	en_US
dc.description.sponsorship	United States. Army Research Office (MURI Grant No. W911NF-06-1-0076)	en_US
dc.description.sponsorship	Multidisciplinary University Research Initiative (MURI) (AFOSR Grant FA9550-06-1-0324)	en_US
dc.description.sponsorship	Singapore. Agency for Science, Technology and Research	en_US
dc.language.iso	en_US
dc.publisher	Institute of Electrical and Electronics Engineers	en_US
dc.relation.isversionof	http://dx.doi.org/10.1109/tsp.2010.2042478	en_US
dc.rights	Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.	en_US
dc.source	IEEE	en_US
dc.title	Learning Gaussian Tree Models: Analysis of Error Exponents and Extremal Structures	en_US
dc.type	Article	en_US
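The abstract refers to the maximum-likelihood estimator of the tree structure; for tree-structured models this is the classical Chow-Liu procedure, a maximum-weight spanning tree over pairwise empirical mutual informations. As a minimal sketch (not the paper's own code): for jointly Gaussian variables the pairwise mutual information is I(X_i; X_j) = -1/2 log(1 - ρ_ij²), so the tree can be estimated from the sample correlation matrix alone. All function and variable names below are illustrative.

```python
import numpy as np

def chow_liu_tree(samples: np.ndarray) -> list[tuple[int, int]]:
    """Maximum-likelihood tree estimate from an (n_samples, d) data matrix.

    Sketch of the Chow-Liu algorithm for Gaussian data: build a
    maximum-weight spanning tree whose edge weights are the pairwise
    empirical mutual informations -0.5 * log(1 - rho^2).
    """
    d = samples.shape[1]
    rho = np.corrcoef(samples, rowvar=False)
    # Gaussian mutual information per pair; monotone in |rho|, so the
    # MST over |rho| would give the same tree. Clip to avoid log(0).
    weights = -0.5 * np.log(1.0 - np.clip(rho ** 2, 0.0, 1.0 - 1e-12))

    # Kruskal's algorithm with a simple union-find over the d nodes.
    parent = list(range(d))

    def find(x: int) -> int:
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    candidates = sorted(
        ((weights[i, j], j, i) for i in range(d) for j in range(i)),
        reverse=True,
    )
    tree: list[tuple[int, int]] = []
    for _, u, v in candidates:
        ru, rv = find(u), find(v)
        if ru != rv:  # adding (u, v) creates no cycle
            parent[ru] = rv
            tree.append((u, v))
            if len(tree) == d - 1:
                break
    return tree
```

On samples from a strongly correlated Markov chain x0 - x1 - x2, the estimator recovers the chain rather than the spurious (x0, x2) edge, illustrating the correlation-decay intuition at the end of the abstract: the x0-x2 correlation is roughly the product of the edge correlations, hence strictly smaller.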
dc.identifier.citation	Tan, V. Y. F., A. Anandkumar, and A. S. Willsky. “Learning Gaussian Tree Models: Analysis of Error Exponents and Extremal Structures.” IEEE Transactions on Signal Processing 58.5 (2010): 2701-2714. © 2010 IEEE	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science	en_US
dc.contributor.department	Massachusetts Institute of Technology. Laboratory for Information and Decision Systems	en_US
dc.contributor.approver	Willsky, Alan S.
dc.contributor.mitauthor	Tan, Vincent Yan Fu
dc.contributor.mitauthor	Anandkumar, Animashree
dc.contributor.mitauthor	Willsky, Alan S.
dc.relation.journal	IEEE Transactions on Signal Processing	en_US
dc.eprint.version	Final published version	en_US
dc.type.uri	http://purl.org/eprint/type/JournalArticle	en_US
eprint.status	http://purl.org/eprint/status/PeerReviewed	en_US
dspace.orderedauthors	Tan, Vincent Y. F.; Anandkumar, Animashree; Willsky, Alan S.	en
dc.identifier.orcid	https://orcid.org/0000-0003-0149-5888
mit.license	PUBLISHER_POLICY	en_US
mit.metadata.status	Complete

