Acceleration of optimization algorithms

Juan Maulen Muñoz

Research output: Thesis › Thesis fully internal (DIV)


Abstract

This thesis studies the convergence and performance of optimization algorithms, focusing on two techniques: a restart scheme for a continuous-time dynamics, and the inclusion of inertia in Krasnoselskii-Mann iterations.

The first part contains a convergence result for a restart scheme applied to a convex function through the solution trajectories of a differential equation with curvature-dependent damping. This dynamics can be interpreted as the continuous-time counterpart of Nesterov's accelerated method, with an additional term depending on the Hessian of the function. The result extends the restart scheme proposed by Su, Boyd and Candès in 2016. An existence theorem for the solutions of the differential equation is also established, and numerical experiments illustrate the performance of the restart routine.
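For orientation only, and not as a formula taken from the thesis, a commonly studied form of such inertial dynamics with Hessian-driven damping, together with the speed restart criterion of Su, Boyd and Candès, can be written as

\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta\,\nabla^2 f(x(t))\,\dot{x}(t) + \nabla f(x(t)) = 0,

with the time variable restarted whenever \frac{d}{dt}\,\|\dot{x}(t)\|^2 < 0, i.e. when the trajectory begins to slow down. Here f is the convex objective and \alpha, \beta > 0 are damping parameters; the exact dynamics and restart rule analyzed in the thesis may use a different parametrization.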

The second part focuses on the inclusion of inertia in Krasnoselskii-Mann iterations governed by a family of quasi-nonexpansive operators defined over a Hilbert space. Krasnoselskii-Mann iterations can be seen as a relaxed form of Banach-Picard iterations and, under standard hypotheses, converge to a common fixed point of the family of operators. Fixed-point iterations play a central role in defining optimization algorithms for a wide range of problems. Results on the weak and strong convergence of the iterations are provided, depending on the hypotheses imposed on the family of operators. Numerical illustrations are presented for two new inertial algorithms, whose performance is superior to that of their non-inertial counterparts.
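As a minimal sketch of the idea, the following Python snippet implements an inertial Krasnoselskii-Mann iteration for a single nonexpansive operator with constant parameters; the operator T, the parameter names alpha and lam, and the projection example are illustrative assumptions, whereas the thesis treats families of quasi-nonexpansive operators and more general parameter choices.

import numpy as np

def inertial_km(T, x0, alpha=0.3, lam=0.5, n_iter=200):
    """Inertial Krasnoselskii-Mann iteration (illustrative sketch).

    T     : a (quasi-)nonexpansive operator, given as a callable
    x0    : starting point
    alpha : inertial (extrapolation) parameter
    lam   : relaxation parameter in (0, 1)
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(n_iter):
        y = x + alpha * (x - x_prev)                  # inertial extrapolation step
        x_prev, x = x, (1.0 - lam) * y + lam * T(y)   # relaxed Krasnoselskii-Mann update
    return x

# Example: T is the projection onto the unit ball, a nonexpansive operator
# whose fixed points are exactly the points of the ball.
proj_ball = lambda z: z if np.linalg.norm(z) <= 1 else z / np.linalg.norm(z)
print(inertial_km(proj_ball, x0=np.array([3.0, -4.0])))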
Original language: English
Qualification: Doctor of Philosophy
Awarding Institution:
  • University of Groningen
Supervisors/Advisors:
  • Peypouquet, Juan, Supervisor
  • Daniilidis, A., Supervisor, External person
Award date: 12-Dec-2023
Place of Publication: [Groningen]
Publication status: Published - 2023
