Stable machine-learning parameterization of subgrid processes for climate modeling at a range of resolutions
Author(s): Yuval, Janni; O'Gorman, Paul A.
Global climate models represent small-scale processes such as convection using subgrid models known as parameterizations, and these parameterizations contribute substantially to uncertainty in climate projections. Machine learning of new parameterizations from high-resolution model output is a promising approach, but such parameterizations have been prone to issues of instability and climate drift, and their performance for different grid spacings has not yet been investigated. Here we use a random forest to learn a parameterization from coarse-grained output of a three-dimensional high-resolution idealized atmospheric model. The parameterization leads to stable simulations at coarse resolution that replicate the climate of the high-resolution simulation. Retraining for different coarse-graining factors shows the parameterization performs best at smaller horizontal grid spacings. Our results yield insights into parameterization performance across length scales, and they also demonstrate the potential for learning parameterizations from global high-resolution simulations that are now emerging.
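The abstract describes learning a subgrid parameterization by fitting a random forest to coarse-grained output of a high-resolution simulation. A minimal sketch of that workflow is shown below, assuming synthetic stand-in data; the variable names, input/output choices, and hyperparameters are hypothetical and do not reflect the paper's actual configuration.

```python
# Illustrative sketch only: all data, variable names, and hyperparameters
# here are hypothetical stand-ins, not the paper's actual setup.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Stand-in for coarse-grained columns from a high-resolution simulation:
# inputs might be vertical profiles (e.g. temperature and humidity,
# flattened per column); targets are the subgrid tendencies the
# parameterization must predict at each model level.
n_columns, n_levels = 2000, 16
inputs = rng.normal(size=(n_columns, 2 * n_levels))
targets = np.tanh(inputs[:, :n_levels]) + 0.1 * rng.normal(size=(n_columns, n_levels))

# Fit a multi-output random forest mapping coarse-grained state -> tendencies.
model = RandomForestRegressor(n_estimators=50, max_depth=12, random_state=0)
model.fit(inputs[:1500], targets[:1500])

# At run time, the coarse-resolution model would query the fitted forest
# each time step to obtain subgrid tendencies for every grid column.
pred = model.predict(inputs[1500:])
print(pred.shape)  # (500, 16)
```

Retraining for a different coarse-graining factor, as in the paper, would amount to regenerating the coarse-grained training pairs at the new grid spacing and refitting the same model.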
Department: Massachusetts Institute of Technology. Department of Earth, Atmospheric, and Planetary Sciences
Publisher: Springer Science and Business Media LLC
Citation: Yuval, Janni and Paul A. O'Gorman. "Stable machine-learning parameterization of subgrid processes for climate modeling at a range of resolutions." Nature Communications 11, 1 (July 2020): 3295. © 2020 The Author(s)
Final published version