Abstract
This thesis studies the convergence and performance of optimization algorithms through two techniques: a restart scheme for a continuous-time dynamics, and the inclusion of inertia in Krasnoselskii-Mann iterations.
The first part presents a convergence result for a restart scheme applied to a convex function along the solution trajectories of a differential equation with curvature-dependent damping. This dynamics can be interpreted as a continuous-time counterpart of Nesterov's accelerated method with an additional term that depends on the Hessian of the function. The results extend the restart scheme proposed by Su, Boyd and Candès in 2016. An existence theorem for solutions of the differential equation is established, and numerical experiments illustrate the performance of the restart routine.
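For context, a commonly studied dynamics of this type combines the vanishing damping of Su, Boyd and Candès with a Hessian-driven damping term; the specific parameters $\alpha$ and $\beta$ below are illustrative assumptions, not the thesis's exact formulation:

```latex
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t)
  + \beta\,\nabla^{2} f(x(t))\,\dot{x}(t)
  + \nabla f(x(t)) = 0, \qquad \alpha, \beta > 0.
```

Setting $\beta = 0$ and $\alpha = 3$ recovers the ODE associated with Nesterov's accelerated method in the continuous setting.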
The second part focuses on the inclusion of inertia in Krasnoselskii-Mann iterations governed by a family of quasi-nonexpansive operators defined on a Hilbert space. Krasnoselskii-Mann iterations can be seen as a relaxation of Banach-Picard iterations and, under standard hypotheses, converge to a common fixed point of the family of operators. Fixed-point iterations play a central role in the design of optimization algorithms for a wide range of problems. Results on the weak and strong convergence of the iterations are provided, depending on the hypotheses imposed on the family of operators. Numerical illustrations are presented for two new inertial algorithms, whose performance is superior to that of their non-inertial counterparts.
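The generic shape of an inertial Krasnoselskii-Mann step can be sketched as follows. This is a minimal illustration, not the thesis's algorithms: the operator `T`, the relaxation parameter `alpha`, and the inertial parameter `beta` are assumed values chosen so the iteration visibly converges.

```python
def T(x):
    # Illustrative operator: a 1/2-contraction on the real line
    # (hence quasi-nonexpansive), with unique fixed point x = 3.0.
    return 0.5 * (x + 3.0)

def inertial_km(T, x0, alpha=0.5, beta=0.3, iters=200):
    """Inertial Krasnoselskii-Mann iteration:
        y_k     = x_k + beta * (x_k - x_{k-1})      (inertial extrapolation)
        x_{k+1} = (1 - alpha) * y_k + alpha * T(y_k) (relaxed KM update)
    """
    x_prev, x = x0, x0
    for _ in range(iters):
        y = x + beta * (x - x_prev)
        x_prev, x = x, (1 - alpha) * y + alpha * T(y)
    return x

print(inertial_km(T, 0.0))  # approaches the fixed point 3.0
```

Taking `beta = 0` recovers the classical (non-inertial) Krasnoselskii-Mann scheme; the extrapolation step reuses the previous iterate to accelerate progress toward the fixed point.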
| Original language | English |
| ---|--- |
| Qualification | Doctor of Philosophy |
| Awarding Institution | |
| Supervisors/Advisors | |
| Award date | 12-Dec-2023 |
| Place of Publication | [Groningen] |
| Publisher | |
| DOIs | |
| Publication status | Published - 2023 |