A Unified Approach to Learning Ising Models: Beyond Independence and Bounded Width
Author(s)
Gaitonde, Jason; Mossel, Elchanan
Terms of use
Creative Commons Attribution
Abstract
We revisit the well-studied problem of efficiently learning the underlying structure and parameters of an Ising model from data.
Current algorithmic approaches achieve essentially optimal sample
complexity when samples are generated i.i.d. from the stationary
measure and the underlying model satisfies “width” constraints that
bound the total ℓ1 interaction involving each node. However, these
assumptions are not satisfied in some important settings of interest,
like temporally correlated data or more complicated models (like
spin glasses) that do not satisfy width bounds.
We analyze a simple existing approach based on node-wise logistic regression, and show it provably succeeds at efficiently recovering the underlying Ising model in several new settings:
(1) Given dynamically generated data from a wide variety of
Markov chains, including Glauber, block, and round-robin
dynamics, logistic regression recovers the parameters with
sample complexity that is optimal up to log log factors. This
generalizes the specialized algorithm of Bresler, Gamarnik,
and Shah (IEEE Trans. Inf. Theory ’18) for structure recovery
in bounded degree graphs from Glauber dynamics.
(2) For the Sherrington-Kirkpatrick model of spin glasses, given
poly(n) independent samples, logistic regression recovers
the parameters in most of the proven high-temperature
regime via a simple reduction to weaker structural properties of the measure. This improves on recent work of Anari,
Jain, Koehler, Pham, and Vuong (SODA ’24) which gives
distribution learning at higher temperature.
(3) As a simple byproduct of our techniques, logistic regression
achieves an exponential improvement in learning from samples in the M-regime of data considered by Dutt, Lokhov,
Vuffray, and Misra (ICML ’21) as well as novel guarantees
for learning from the adversarial Glauber dynamics of Chin,
Moitra, Mossel, and Sandon.
Our approach thus provides a significant generalization of the
elegant analysis of logistic regression by Wu, Sanghavi, and Dimakis
(NeurIPS ’19) without any algorithmic modification in each setting.
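The node-wise logistic regression approach analyzed in the abstract can be illustrated with a minimal sketch: for each node i, the conditional law of spin x_i given the other spins is a logistic function of a linear form in those spins, so fitting a logistic regression of x_i on the rest recovers (up to a factor of 2) the i-th row of the interaction matrix. The sketch below is illustrative only, not the paper's implementation; the graph, coupling strength 0.4, sample count, and optimization settings are all assumptions chosen for a small runnable example, and samples are drawn by restarting Glauber dynamics rather than from a single trajectory.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: 8-spin Ising model, zero external field,
# sparse symmetric couplings of strength 0.4 (illustrative values).
n = 8
J = np.zeros((n, n))
for (i, j) in [(0, 1), (1, 2), (2, 3), (4, 5), (6, 7)]:
    J[i, j] = J[j, i] = 0.4

def glauber_sample(J, steps, rng):
    """Run single-site Glauber dynamics from a random start; return final config."""
    n = J.shape[0]
    x = rng.choice([-1.0, 1.0], size=n)
    for _ in range(steps):
        i = rng.integers(n)
        # P(x_i = +1 | rest) = sigmoid(2 * sum_j J[i, j] x_j), with J[i, i] = 0
        p = 1.0 / (1.0 + np.exp(-2.0 * (J[i] @ x)))
        x[i] = 1.0 if rng.random() < p else -1.0
    return x

# Fresh restart per sample keeps draws roughly independent in this toy;
# the paper's point is that even correlated dynamical data suffices.
X = np.array([glauber_sample(J, 20 * n, rng) for _ in range(1000)])

def nodewise_logistic_regression(X, i, lr=0.1, iters=500):
    """Estimate row i of J by logistic regression of x_i on the other spins."""
    y = (X[:, i] + 1.0) / 2.0          # labels in {0, 1}
    F = X.copy()
    F[:, i] = 0.0                      # features: all spins except x_i
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-F @ w))
        w -= lr * F.T @ (p - y) / len(y)   # gradient step on logistic loss
    return w / 2.0                     # conditional law uses 2 * J[i], so halve

J_hat0 = nodewise_logistic_regression(X, 0)
```

Here `J_hat0[1]` should be close to the true coupling 0.4, while entries for non-neighbors of node 0 should be near zero; repeating over all nodes recovers the full (symmetrized) interaction matrix.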
Description
STOC ’24, June 24–28, 2024, Vancouver, BC, Canada
Date issued
2024-06-10
Department
Massachusetts Institute of Technology. Department of Mathematics
Publisher
ACM|STOC 2024: Proceedings of the 56th Annual ACM Symposium on Theory of Computing
Citation
Gaitonde, Jason and Mossel, Elchanan. 2024. "A Unified Approach to Learning Ising Models: Beyond Independence and Bounded Width."
Version: Final published version
ISBN
979-8-4007-0383-6