## Abstract

In a Hilbert space setting H, we study the fast convergence properties as t → +∞ of the trajectories of the second-order differential equation

ẍ(t) + (α/t) ẋ(t) + ∇Φ(x(t)) = g(t),

where ∇Φ is the gradient of a convex continuously differentiable function Φ : H → ℝ, α is a positive parameter, and g : [t₀, +∞[ → H is a small perturbation term. In this inertial system, the viscous damping coefficient α/t vanishes asymptotically, but not too rapidly. For α ≥ 3 and ∫_{t₀}^{+∞} t‖g(t)‖ dt < +∞, just assuming that argmin Φ ≠ ∅, we show that any trajectory of the above system satisfies the fast convergence property Φ(x(t)) − min_H Φ ≤ C/t². Moreover, for α > 3, any trajectory converges weakly to a minimizer of Φ. Strong convergence is established in various practical situations. These results complement the O(t⁻²) rate of convergence for the values obtained by Su, Boyd and Candès in the unperturbed case g = 0. Time discretization of this system, and of some of its variants, provides new fast converging algorithms, expanding the field of rapid methods for structured convex minimization introduced by Nesterov, and further developed by Beck and Teboulle with FISTA. This study also complements recent advances due to Chambolle and Dossal.
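The dynamics can be illustrated numerically. The sketch below (not from the paper; all names and the toy objective are illustrative assumptions) integrates the unperturbed system ẍ(t) + (α/t)ẋ(t) + ∇Φ(x(t)) = 0 for Φ(x) = ½‖x‖² on ℝ², using a simple semi-implicit Euler scheme, and reports the quantity t²·Φ(x(t)), which the abstract's rate Φ(x(t)) − min Φ ≤ C/t² predicts stays bounded (here min Φ = 0).

```python
import numpy as np

# Illustrative sketch, not the paper's algorithm: integrate
#   x''(t) + (alpha/t) x'(t) + grad Phi(x(t)) = 0      (g = 0)
# for the toy objective Phi(x) = 0.5 * ||x||^2, minimized at x = 0.

def grad_phi(x):
    # Gradient of Phi(x) = 0.5 * ||x||^2.
    return x

def run(alpha=3.0, t0=1.0, T=200.0, dt=1e-3):
    x = np.array([1.0, -2.0])   # arbitrary initial point
    v = np.zeros_like(x)        # initial velocity x'(t0) = 0
    t = t0
    while t < T:
        # Semi-implicit Euler: update velocity, then position.
        v += dt * (-(alpha / t) * v - grad_phi(x))
        x += dt * v
        t += dt
    return x, t

x, t = run()
phi = 0.5 * float(np.dot(x, x))   # Phi(x(T)); min Phi = 0 for this toy problem
print(f"t = {t:.1f}, Phi(x(t)) = {phi:.3e}, t^2 * Phi(x(t)) = {t * t * phi:.3e}")
```

For this quadratic toy objective the decay is in fact faster than the worst-case 1/t² bound, so t²·Φ(x(t)) tends to zero; the theorem's bound is nonetheless visible as the quantity remaining small along the trajectory.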

Original language | English
---|---
Pages (from-to) | 123-175
Number of pages | 53
Journal | Mathematical Programming
Volume | 168
Issue number | 1-2
DOIs |
Publication status | Published - 1-Mar-2018
Externally published | Yes

## Keywords

- Convex optimization
- Dynamical systems
- Fast convergent methods
- Gradient flows
- Inertial dynamics
- Nesterov method
- Vanishing viscosity