Stable machine-learning parameterization of subgrid processes for climate modeling at a range of resolutions
Author(s)
Yuval, Janni; O'Gorman, Paul
License
Creative Commons Attribution
Abstract
Global climate models represent small-scale processes such as convection using subgrid models known as parameterizations, and these parameterizations contribute substantially to uncertainty in climate projections. Machine learning of new parameterizations from high-resolution model output is a promising approach, but such parameterizations have been prone to issues of instability and climate drift, and their performance for different grid spacings has not yet been investigated. Here we use a random forest to learn a parameterization from coarse-grained output of a three-dimensional high-resolution idealized atmospheric model. The parameterization leads to stable simulations at coarse resolution that replicate the climate of the high-resolution simulation. Retraining for different coarse-graining factors shows the parameterization performs best at smaller horizontal grid spacings. Our results yield insights into parameterization performance across length scales, and they also demonstrate the potential for learning parameterizations from global high-resolution simulations that are now emerging.
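The workflow the abstract describes (learning a mapping from coarse-grained model state to subgrid tendencies with a random forest) can be sketched minimally as follows. This is an illustrative toy, not the authors' configuration: the synthetic features, targets, and hyperparameters (`n_estimators`, `max_depth`) are assumptions standing in for coarse-grained high-resolution model output.

```python
# Illustrative sketch of a random-forest subgrid parameterization
# (synthetic data; NOT the authors' actual setup or variables).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Stand-in for coarse-grained high-resolution output:
# X = coarse-grid state (n_samples, n_features), y = a subgrid tendency.
n_samples, n_features = 2000, 8
X = rng.normal(size=(n_samples, n_features))
y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=n_samples)

# Fit on a training split; the trained forest would then be called inside
# the coarse-resolution model in place of a conventional parameterization.
forest = RandomForestRegressor(n_estimators=50, max_depth=10, random_state=0)
forest.fit(X[:1500], y[:1500])

score = forest.score(X[1500:], y[1500:])  # R^2 on held-out samples
print(f"held-out R^2: {score:.2f}")
```

A practical appeal of random forests noted in this line of work is that their predictions are averages of training samples, which helps keep online simulations stable.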
Date issued
2020-07
Department
Massachusetts Institute of Technology. Department of Earth, Atmospheric, and Planetary Sciences
Journal
Nature Communications
Publisher
Springer Science and Business Media LLC
Citation
Yuval, Janni and Paul A. O'Gorman. "Stable machine-learning parameterization of subgrid processes for climate modeling at a range of resolutions." Nature Communications 11, 1 (July 2020): 3295 © 2020 The Author(s)
Version: Final published version
ISSN
2041-1723