Universal Gradient Methods in Nonlinear Optimization
- #nonlinear
- #gradient-methods
- #optimization
- Universal first-order methods for composite optimization, with a new complexity analysis.
- Provides universal convergence guarantees not directly linked to any parametric problem class.
- Convergence rates for specific problem classes follow by substituting the corresponding Global Curvature Bound.
- Analyzes a simple gradient method for nonconvex minimization and for convex composite optimization.
- An accelerated variant ensures the best possible convergence rate across all parametric problem classes (see the rate note after this list).
- The only input parameter is the required accuracy of the approximate solution (see the code sketch after this list).
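- Rate note: as context (an assumption based on the standard universal-methods literature, e.g. Nesterov's universal gradient methods, not a formula quoted from this paper), the usual parametric classes are functions with Hölder-continuous gradients, $\|\nabla f(x)-\nabla f(y)\| \le L_\nu \|x-y\|^\nu$ for some $\nu \in [0,1]$; an accelerated universal method needs $O\big((L_\nu R^{1+\nu}/\epsilon)^{2/(1+3\nu)}\big)$ iterations, interpolating between $O(\epsilon^{-2})$ at $\nu=0$ (nonsmooth) and $O(\epsilon^{-1/2})$ at $\nu=1$ (Lipschitz gradient), and this exponent is optimal for each $\nu$.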
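- Code sketch: a minimal Python sketch of the "accuracy-only" interface, in the spirit of a universal gradient step with backtracking; the $\epsilon/2$-relaxed acceptance test, the function names, and the initial estimate `L0` are assumptions following the standard universal-method template, not the paper's exact scheme.

```python
import numpy as np

def universal_gradient_method(f, grad, x0, eps, L0=1.0, max_iter=1000):
    """Sketch of a universal gradient method: `eps` is the only
    accuracy parameter; the backtracking line search adapts the
    curvature estimate L without knowing the smoothness class."""
    x = np.asarray(x0, dtype=float)
    L = L0
    for _ in range(max_iter):
        fx, gx = f(x), grad(x)
        if np.linalg.norm(gx) <= eps:   # crude stopping test for the sketch
            break
        # Double L until the eps/2-relaxed quadratic upper model holds;
        # the slack term is what makes the step "universal".
        while True:
            x_next = x - gx / L
            dx = x_next - x
            if f(x_next) <= fx + gx @ dx + 0.5 * L * (dx @ dx) + 0.5 * eps:
                break
            L *= 2.0
        x = x_next
        L = max(L0, L / 2.0)  # let the estimate shrink again between steps
    return x

# Usage: minimize a simple smooth quadratic; eps is the only tuning knob.
if __name__ == "__main__":
    f = lambda x: 0.5 * float(x @ x)
    grad = lambda x: x
    print(universal_gradient_method(f, grad, np.ones(5), eps=1e-6))
```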