# On the Convergence Analysis of the Optimized Gradient Method

```bibtex
@article{Kim2017OnTC,
  title   = {On the Convergence Analysis of the Optimized Gradient Method},
  author  = {Donghwan Kim and Jeffrey A. Fessler},
  journal = {Journal of Optimization Theory and Applications},
  year    = {2017},
  volume  = {172},
  pages   = {187-205}
}
```

This paper considers unconstrained minimization of smooth convex functions having Lipschitz continuous gradients with known Lipschitz constant. We recently proposed the optimized gradient method for this problem and showed that its worst-case convergence bound on the cost-function decrease is half that of Nesterov’s fast gradient method, while admitting a similarly efficient practical implementation. Drori showed recently that the optimized gradient method has…
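The page does not reproduce the method itself; as an illustrative sketch (following the published form of the optimized gradient method, with the gradient oracle `grad`, Lipschitz constant `L`, and iteration count as assumed inputs), the iteration combines a usual gradient step with an over-relaxed momentum term:

```python
import numpy as np

def ogm(grad, L, x0, n_iter):
    """Sketch of the optimized gradient method (OGM) for minimizing a
    smooth convex f with L-Lipschitz gradient. `grad(x)` returns the
    gradient of f at x; returns the final primary iterate x_N."""
    x = y = np.asarray(x0, dtype=float)
    theta = 1.0
    for k in range(n_iter):
        y_next = x - grad(x) / L  # plain gradient step
        if k < n_iter - 1:
            theta_next = (1 + np.sqrt(1 + 4 * theta ** 2)) / 2
        else:
            # the final iteration uses a modified momentum parameter
            theta_next = (1 + np.sqrt(1 + 8 * theta ** 2)) / 2
        # momentum combines both the y-difference and the gradient direction
        x = (y_next
             + ((theta - 1) / theta_next) * (y_next - y)
             + (theta / theta_next) * (y_next - x))
        y, theta = y_next, theta_next
    return x
```

Compared with Nesterov's fast gradient method, the extra `(theta / theta_next) * (y_next - x)` term is what yields the twice-smaller worst-case bound on the cost-function decrease.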

#### 25 Citations

Adaptive Restart of the Optimized Gradient Method for Convex Optimization

- Mathematics, Computer Science
- J. Optim. Theory Appl.
- 2018

A new first-order method is derived that resembles the optimized gradient method for strongly convex quadratic problems with known function parameters, yielding a linear convergence rate that is faster than that of the analogous version of the fast gradient method.

2 Problem and Methods; 2.1 Smooth and Strongly Convex Problem

- 2018

First-order methods with momentum, such as Nesterov’s fast gradient method, are very useful for convex optimization problems, but can exhibit undesirable oscillations yielding slow convergence rates…

Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions

- Computer Science, Mathematics
- J. Optim. Theory Appl.
- 2021

This paper optimizes the step coefficients of first-order methods for smooth convex minimization in terms of the worst-case convergence bound of the decrease in the gradient norm, and illustrates that the proposed method has a computationally efficient form that is similar to the optimized gradient method.

Efficient first-order methods for convex minimization: a constructive approach

- Mathematics, Computer Science
- Math. Program.
- 2020

We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex…

Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization

- Mathematics, Computer Science
- SIAM J. Optim.
- 2017

A new analytical worst-case guarantee is presented for the proximal point algorithm that is a factor of two better than previously known, and the standard worst-case guarantee for the conditional gradient method is improved by more than a factor of two.

Convex interpolation and performance estimation of first-order methods for convex optimization

- Computer Science, Mathematics
- 2017

This thesis formulates a generic optimization problem looking for the worst-case scenarios of first-order methods in convex optimization, and transforms PEPs into solvable finite-dimensional semidefinite programs, from which one obtains worst-case guarantees and worst-case functions, along with the corresponding explicit proofs.

On the Optimal Ergodic Sublinear Convergence Rate of the Relaxed Proximal Point Algorithm for Variational Inequalities

- Mathematics
- 2019

This paper investigates the optimal ergodic sublinear convergence rate of the relaxed proximal point algorithm for solving monotone variational inequality problems. The exact worst-case convergence…

Smooth strongly convex interpolation and exact worst-case performance of first-order methods

- Mathematics, Computer Science
- Math. Program.
- 2017

We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex…

An optimized first-order method for image restoration

- Mathematics, Computer Science
- 2015 IEEE International Conference on Image Processing (ICIP)
- 2015

The convergence analysis of OGM is discussed, its fast convergence on an image restoration problem using a smoothed total variation (TV) regularizer is explored, and its extension to nonsmooth convex minimization for image restoration with l1-sparsity regularization is investigated.

A frequency-domain analysis of inexact gradient methods

- Mathematics, Computer Science
- 2021

Improved analytic bounds for the convergence rate of Nesterov’s accelerated method on strongly convex functions are derived, based on frequency-domain criteria for the stability of nonlinear systems.

#### References

Showing 1–10 of 18 references

Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization

- Mathematics, Computer Science
- SIAM J. Optim.
- 2017

A new analytical worst-case guarantee is presented for the proximal point algorithm that is a factor of two better than previously known, and the standard worst-case guarantee for the conditional gradient method is improved by more than a factor of two.

Performance of first-order methods for smooth convex minimization: a novel approach

- Computer Science, Mathematics
- Math. Program.
- 2014

A novel approach is presented for analyzing the worst-case performance of first-order black-box optimization methods, focusing on smooth unconstrained convex minimization over the Euclidean space, and a new and tight analytical bound on performance is derived.

Gradient methods for minimizing composite functions

- Mathematics, Computer Science
- Math. Program.
- 2013

In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is…

Smooth strongly convex interpolation and exact worst-case performance of first-order methods

- Mathematics, Computer Science
- Math. Program.
- 2017

We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex…

Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints

- Computer Science, Mathematics
- SIAM J. Optim.
- 2016

A new framework to analyze and design iterative optimization algorithms built on the notion of Integral Quadratic Constraints (IQC) from robust control theory is developed, proving new inequalities about convex functions and providing a version of IQC theory adapted for use by optimization researchers.

Optimized first-order methods for smooth convex minimization

- Computer Science, Mathematics
- Math. Program.
- 2016

We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle (Math Program 145(1–2):451–482, 2014. doi:10.1007/s10107-013-0653-0) recently described…

Introductory Lectures on Convex Optimization - A Basic Course

- Computer Science
- Applied Optimization
- 2004

It was in the middle of the 1980s when the seminal paper by Karmarkar opened a new epoch in nonlinear optimization; it became increasingly common for new methods to be provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments.

Some methods of speeding up the convergence of iteration methods

- Mathematics
- 1964

Abstract: For the solution of the functional equation P(x) = 0 (where P is an operator, usually linear, from B into B, and B is a Banach space), iteration methods are generally used. These consist…
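This 1964 reference is Polyak's heavy-ball method, the earliest momentum scheme in this line of work. A minimal sketch of the iteration (the function names and step/momentum parameters below are illustrative, not from this page):

```python
import numpy as np

def heavy_ball(grad, x0, step, momentum, n_iter):
    """Polyak's heavy-ball iteration:
    x_{k+1} = x_k - step * grad(x_k) + momentum * (x_k - x_{k-1})."""
    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x_next = x - step * grad(x) + momentum * (x - x_prev)
        x_prev, x = x, x_next
    return x
```

For a quadratic with Hessian eigenvalues in [mu, L], the classical tuning is step = 4/(sqrt(L)+sqrt(mu))**2 and momentum = ((sqrt(L)-sqrt(mu))/(sqrt(L)+sqrt(mu)))**2, which gives the accelerated linear rate on that problem class.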

A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems

- Mathematics, Computer Science
- SIAM J. Imaging Sci.
- 2009

A new fast iterative shrinkage-thresholding algorithm (FISTA) is presented that preserves the computational simplicity of ISTA but has a global rate of convergence proven to be significantly better, both theoretically and practically.
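For concreteness, a minimal sketch of FISTA applied to l1-regularized least squares (the problem instance and helper names are assumptions for illustration, not taken from this page):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista(A, b, lam, n_iter):
    """FISTA sketch for min_x 0.5*||A x - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        # proximal gradient step at the extrapolated point y
        x_next = soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
        t_next = (1 + np.sqrt(1 + 4 * t ** 2)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x
```

Each iteration costs the same as one ISTA step (one gradient plus one shrinkage); only the extrapolation point `y` differs, which is what lifts the rate from O(1/k) to O(1/k^2).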

The exact information-based complexity of smooth convex minimization

- Mathematics, Computer Science
- J. Complex.
- 2017

We obtain a new lower bound on the information-based complexity of first-order minimization of smooth and convex functions. We show that the bound matches the worst-case performance of the recently…