Gradient-Based Dimension Reduction of Multivariate Vector-Valued Functions
Author(s)
Zahm, Olivier; Constantine, Paul G; Prieur, Clémentine; Marzouk, Youssef M
Publisher Policy
Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
© 2020 Society for Industrial and Applied Mathematics. Multivariate functions encountered in high-dimensional uncertainty quantification problems often vary most strongly along a few dominant directions in the input parameter space. We propose a gradient-based method for detecting these directions and using them to construct ridge approximations of such functions, in the case where the functions are vector-valued (e.g., taking values in R^n). The methodology consists of minimizing an upper bound on the approximation error, obtained via subspace Poincaré inequalities. We provide a thorough mathematical analysis in the case where the parameter space is equipped with a Gaussian probability measure. The resulting method generalizes the notion of active subspaces associated with scalar-valued functions. A numerical illustration shows that using gradients of the function yields effective dimension reduction. We also show how the choice of norm on the codomain of the function affects its low-dimensional approximation.
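The core construction described in the abstract can be illustrated with a short sketch. The idea behind (generalized) active subspaces is to estimate the matrix H = E[J(x)^T R J(x)], where J is the Jacobian of the vector-valued function and R is an SPD matrix encoding the chosen norm on the codomain, and then take the leading eigenvectors of H as the dominant directions. The sketch below is a minimal illustration of this idea, not the authors' exact algorithm; the function name `active_subspace` and the linear test function are hypothetical choices for demonstration.

```python
import numpy as np

def active_subspace(jacobians, k, codomain_norm=None):
    """Estimate a k-dimensional dominant subspace from Jacobian samples.

    jacobians: array of shape (N, n, d), the Jacobians of f: R^d -> R^n
               evaluated at N samples of the input measure.
    codomain_norm: optional (n, n) SPD matrix defining the norm on R^n;
                   defaults to the identity (Euclidean norm).
    Returns (U, eigvals): a (d, k) orthonormal basis of the estimated
    subspace and all eigenvalues of H in decreasing order.
    """
    N, n, d = jacobians.shape
    R = np.eye(n) if codomain_norm is None else codomain_norm
    # Monte Carlo estimate of H = E[ J(x)^T R J(x) ]  (a d x d matrix)
    H = np.zeros((d, d))
    for J in jacobians:
        H += J.T @ R @ J
    H /= N
    # np.linalg.eigh returns eigenvalues in ascending order; reverse them
    eigvals, eigvecs = np.linalg.eigh(H)
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:k]], eigvals[order]

# Toy example: a linear f(x) = A x whose Jacobian is the constant matrix A.
# A is built to have rank 2, so f varies only along a 2-dimensional subspace.
rng = np.random.default_rng(0)
d, n, N = 10, 3, 200
A = np.zeros((n, d))
A[:, :2] = rng.standard_normal((n, 2))
J = np.repeat(A[None, :, :], N, axis=0)  # same Jacobian at every sample
U, vals = active_subspace(J, k=2)
```

In this linear case H reduces to A^T A, so only the first two eigenvalues are nonzero and the recovered subspace spans the row space of A; for a nonlinear f, the Jacobian samples would instead come from evaluating the gradient at draws from the Gaussian input measure.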
Date issued
2020
Department
Massachusetts Institute of Technology. Department of Aeronautics and Astronautics
Journal
SIAM Journal on Scientific Computing
Publisher
Society for Industrial & Applied Mathematics (SIAM)