DSpace@MIT


Riemannian Adaptive Regularized Newton Methods with Hölder Continuous Hessians

Author(s)
Zhang, Chenyu; Jiang, Rujun
Download: 10589_2025_692_ReferencePDF.pdf (embargoed until 2026-05-21, 1.562 MB)
Open Access Policy

Terms of use
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 (http://creativecommons.org/licenses/by-nc-sa/4.0/)
Abstract
This paper presents strong worst-case iteration and operation complexity guarantees for Riemannian adaptive regularized Newton methods, a unified framework encompassing both Riemannian adaptive regularization (RAR) methods and Riemannian trust region (RTR) methods. We comprehensively characterize the sources of approximation in second-order manifold optimization methods: the objective function's smoothness, the retraction's smoothness, and the subproblem solver's inexactness. Specifically, for a function with a μ-Hölder continuous Hessian, when equipped with a retraction featuring a ν-Hölder continuous differential and a θ-inexact subproblem solver, both RTR and RAR with (2+α)-regularization (where α = min{μ, ν, θ}) locate an (ε, ε^(α/(1+α)))-approximate second-order stationary point within at most O(ε^(−(2+α)/(1+α))) iterations and at most Õ(ε^(−(4+3α)/(2(1+α)))) Hessian-vector products with high probability. These complexity results are novel and sharp, and reduce to an iteration complexity of O(ε^(−3/2)) and an operation complexity of Õ(ε^(−7/4)) when α = 1.
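The smooth-case rates quoted at the end of the abstract follow from plain arithmetic on the two exponents. A minimal sketch using Python's exact rational arithmetic (the function names here are illustrative, not from the paper):

```python
from fractions import Fraction

def iteration_exponent(alpha: Fraction) -> Fraction:
    """Exponent of 1/epsilon in the iteration bound O(eps^-(2+a)/(1+a))."""
    return (2 + alpha) / (1 + alpha)

def operation_exponent(alpha: Fraction) -> Fraction:
    """Exponent of 1/epsilon in the operation bound O~(eps^-(4+3a)/(2(1+a)))."""
    return (4 + 3 * alpha) / (2 * (1 + alpha))

# At alpha = 1 (Lipschitz Hessian, smooth retraction, exact-enough solver)
# the general bounds reduce to the familiar smooth-case rates.
a = Fraction(1)
print(iteration_exponent(a))  # 3/2
print(operation_exponent(a))  # 7/4
```

Smaller α (a less regular Hessian, retraction, or solver) pushes both exponents up, toward 2 and 2 respectively as α → 0.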
Date issued
2025-05-21
URI
https://hdl.handle.net/1721.1/163086
Department
Massachusetts Institute of Technology. Institute for Data, Systems, and Society; Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Journal
Computational Optimization and Applications
Publisher
Springer US
Citation
Zhang, C., Jiang, R. Riemannian Adaptive Regularized Newton Methods with Hölder Continuous Hessians. Comput Optim Appl 92, 29–79 (2025).
Version: Author's final manuscript

Collections
  • MIT Open Access Articles
