
dc.contributor.author: Gaitonde, Jason
dc.contributor.author: Mossel, Elchanan
dc.date.accessioned: 2024-07-19T15:36:34Z
dc.date.available: 2024-07-19T15:36:34Z
dc.date.issued: 2024-06-10
dc.identifier.isbn: 979-8-4007-0383-6
dc.identifier.uri: https://hdl.handle.net/1721.1/155719
dc.description: STOC ’24, June 24–28, 2024, Vancouver, BC, Canada [en_US]
dc.description.abstract: We revisit the well-studied problem of efficiently learning the underlying structure and parameters of an Ising model from data. Current algorithmic approaches achieve essentially optimal sample complexity when samples are generated i.i.d. from the stationary measure and the underlying model satisfies “width” constraints that bound the total ℓ1 interaction involving each node. However, these assumptions are not satisfied in some important settings of interest, like temporally correlated data or more complicated models (like spin glasses) that do not satisfy width bounds. We analyze a simple existing approach based on node-wise logistic regression, and show it provably succeeds at efficiently recovering the underlying Ising model in several new settings: (1) Given dynamically generated data from a wide variety of Markov chains, including Glauber, block, and round-robin dynamics, logistic regression recovers the parameters with sample complexity that is optimal up to log log factors. This generalizes the specialized algorithm of Bresler, Gamarnik, and Shah (IEEE Trans. Inf. Theory ’18) for structure recovery in bounded-degree graphs from Glauber dynamics. (2) For the Sherrington–Kirkpatrick model of spin glasses, given poly(n) independent samples, logistic regression recovers the parameters in most of the proven high-temperature regime via a simple reduction to weaker structural properties of the measure. This improves on recent work of Anari, Jain, Koehler, Pham, and Vuong (SODA ’24), which gives distribution learning at higher temperature. (3) As a simple byproduct of our techniques, logistic regression achieves an exponential improvement in learning from samples in the M-regime of data considered by Dutt, Lokhov, Vuffray, and Misra (ICML ’21), as well as novel guarantees for learning from the adversarial Glauber dynamics of Chin, Moitra, Mossel, and Sandon. Our approach thus provides a significant generalization of the elegant analysis of logistic regression by Wu, Sanghavi, and Dimakis (NeurIPS ’19) without any algorithmic modification in each setting. [en_US]
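The node-wise logistic regression approach named in the abstract can be sketched in plain NumPy. This is an illustrative reconstruction, not the authors' code: function names, the gradient-descent hyperparameters, and the simple i.i.d.-sample setting are all assumptions. It relies only on the textbook fact that under an Ising model each spin's conditional law given the others is exactly a logistic model in the remaining spins.

```python
# Hedged sketch: node-wise logistic regression for Ising parameter recovery
# from i.i.d. +/-1 samples. Illustrative only; not the paper's implementation.
import numpy as np

def fit_node(X, i, lr=1.0, steps=2000):
    """Logistic regression of spin i on the remaining spins.

    For an Ising model P(x) ∝ exp(sum_{j<k} J_jk x_j x_k + sum_j h_j x_j),
    P(x_i = +1 | x_-i) = sigmoid(2 * (sum_j J_ij x_j + h_i)), so the
    population minimizer of the logistic loss is w = (2*J_{i,-i}, 2*h_i).
    """
    y = X[:, i]                                 # target spin, in {-1, +1}
    Z = np.delete(X, i, axis=1)                 # remaining spins as features
    Z = np.hstack([Z, np.ones((len(Z), 1))])    # bias column = external field
    w = np.zeros(Z.shape[1])
    for _ in range(steps):
        margins = y * (Z @ w)
        # gradient of mean log(1 + exp(-y * w.z))
        grad = -(Z * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)
        w -= lr * grad
    return w                                    # (approx 2*J_{i,-i}, 2*h_i)

def learn_ising(X):
    """Estimate the coupling matrix J row by row, then symmetrize."""
    m, n = X.shape
    J = np.zeros((n, n))
    for i in range(n):
        w = fit_node(X, i)
        J[i, np.arange(n) != i] = w[:-1] / 2.0  # halve: conditional has 2*J
    return (J + J.T) / 2.0                      # average the two estimates
```

Running all n regressions in this form costs one convex optimization per node; each recovered row is an independent estimate of the same symmetric coupling, so averaging `J` with its transpose is a cheap variance reduction.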
dc.publisher: ACM|STOC 2024: Proceedings of the 56th Annual ACM Symposium on Theory of Computing [en_US]
dc.relation.isversionof: 10.1145/3618260.3649674 [en_US]
dc.rights: Creative Commons Attribution [en_US]
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ [en_US]
dc.source: Association for Computing Machinery [en_US]
dc.title: A Unified Approach to Learning Ising Models: Beyond Independence and Bounded Width [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Gaitonde, Jason and Mossel, Elchanan. 2024. "A Unified Approach to Learning Ising Models: Beyond Independence and Bounded Width."
dc.contributor.department: Massachusetts Institute of Technology. Department of Mathematics
dc.identifier.mitlicense: PUBLISHER_CC
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
eprint.status: http://purl.org/eprint/status/NonPeerReviewed [en_US]
dc.date.updated: 2024-07-01T07:49:02Z
dc.language.rfc3066: en
dc.rights.holder: The author(s)
dspace.date.submission: 2024-07-01T07:49:02Z
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed [en_US]

