Riemannian Adaptive Regularized Newton Methods with Hölder Continuous Hessians
Author(s)
Zhang, Chenyu; Jiang, Rujun
Download: 10589_2025_692_ReferencePDF.pdf (embargoed until 2026-05-21, 1.562 MB)
Open Access Policy
Creative Commons Attribution-NonCommercial-ShareAlike
Terms of use
Abstract
This paper presents strong worst-case iteration and operation complexity guarantees for Riemannian adaptive regularized Newton methods, a unified framework encompassing both Riemannian adaptive regularization (RAR) methods and Riemannian trust region (RTR) methods. We comprehensively characterize the sources of approximation in second-order manifold optimization methods: the objective function's smoothness, the retraction's smoothness, and the subproblem solver's inexactness. Specifically, for a function with a $\mu$-Hölder continuous Hessian, when equipped with a retraction featuring a $\nu$-Hölder continuous differential and a $\theta$-inexact subproblem solver, both RTR and RAR with $2+\alpha$ regularization (where $\alpha = \min\{\mu, \nu, \theta\}$) locate an $(\epsilon, \epsilon^{\alpha/(1+\alpha)})$-approximate second-order stationary point within at most $O(\epsilon^{-(2+\alpha)/(1+\alpha)})$ iterations and at most $\tilde{O}(\epsilon^{-(4+3\alpha)/(2(1+\alpha))})$ Hessian-vector products with high probability. These complexity results are novel and sharp, and reduce to an iteration complexity of $O(\epsilon^{-3/2})$ and an operation complexity of $\tilde{O}(\epsilon^{-7/4})$ when $\alpha = 1$.
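As a quick sanity check on the stated rates, the two complexity exponents can be evaluated as functions of $\alpha$. The sketch below uses only the formulas quoted in the abstract; the function names are illustrative, not from the paper.

```python
# Illustrative sketch: the iteration and operation complexity exponents
# from the abstract, as functions of alpha = min{mu, nu, theta}.

def iteration_exponent(alpha: float) -> float:
    # Iterations scale as O(eps^{-(2 + alpha)/(1 + alpha)})
    return (2 + alpha) / (1 + alpha)

def operation_exponent(alpha: float) -> float:
    # Hessian-vector products scale as O~(eps^{-(4 + 3*alpha)/(2*(1 + alpha))})
    return (4 + 3 * alpha) / (2 * (1 + alpha))

# At alpha = 1 (Lipschitz Hessian, smooth retraction, accurate solver),
# the exponents recover the classical rates eps^{-3/2} and eps^{-7/4}.
print(iteration_exponent(1.0))  # 1.5
print(operation_exponent(1.0))  # 1.75
```

Both exponents degrade gracefully toward 2 as $\alpha \to 0$, consistent with first-order-like behavior when any of the three Hölder/inexactness parameters vanishes.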
Date issued
2025-05-21
Department
Massachusetts Institute of Technology. Institute for Data, Systems, and Society; Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Journal
Computational Optimization and Applications
Publisher
Springer US
Citation
Zhang, C., Jiang, R. Riemannian Adaptive Regularized Newton Methods with Hölder Continuous Hessians. Comput Optim Appl 92, 29–79 (2025).
Version: Author's final manuscript