

Doctor's Theses (authored and supervised):

S. Schechtman:
"Quelques problèmes en optimisation non-convexe et stochastique (Some Problems in Nonconvex Stochastic Optimization)";
Supervisor, Reviewer: W. Hachem, P. Bianchi, A. Daniilidis, J. Bolte; Université Gustave Eiffel, 2021; oral examination: 2021-12-14.



English abstract:
The subject of this thesis is the analysis of several stochastic algorithms
in a nonconvex setting. The aim is to prove and characterize their convergence.
First, we study a smooth optimization problem, analyzing a family of adaptive algorithms
with momentum that includes the widely used ADAM and Nesterov's
accelerated gradient descent. Convergence and fluctuations of the iterates are established.
A general avoidance-of-traps result for stochastic algorithms underlain
by a nonautonomous differential equation is presented and applied to establish the
nonconvergence of the iterates to saddle points.
The rest of the manuscript is devoted to the case where the function that we seek
to minimize is nonsmooth. Most of our results in this part apply to functions definable
in an o-minimal structure. Firstly, we analyze the constant step version of
the stochastic subgradient descent (SGD) and show that the iterates converge with
high probability to the set of critical points. Secondly, we show that every critical
point of a generic, definable, locally Lipschitz continuous function lies on an
active manifold satisfying a Verdier and an angle condition, and is either a local
minimum, an active strict saddle, or a sharply repulsive critical point. We show
that, under mild conditions on the perturbation sequence, the SGD escapes active
strict saddles and sharply repulsive critical points. An improvement of the projection
formula for definable functions, giving a Lipschitz-like condition on their Clarke
subgradients, is presented and is of independent interest. Finally, we establish an
oscillation phenomenon of the iterates of the SGD and its proximal extensions.
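
To make the constant-step method discussed above concrete, the following is a minimal, self-contained sketch (not taken from the thesis) of constant-step stochastic subgradient descent on a toy nonsmooth, semialgebraic objective. The objective f(x) = |x_1| + x_2^2 / 2, the Gaussian noise model, and the step size are assumptions chosen purely for illustration.

import numpy as np

# Toy nonsmooth, semialgebraic objective: f(x) = |x[0]| + 0.5 * x[1]**2.
# A Clarke subgradient at x is (sign(x[0]), x[1]); at x[0] = 0 any value
# in [-1, 1] is admissible in the first coordinate (we pick 0).
def clarke_subgradient(x):
    g0 = np.sign(x[0]) if x[0] != 0.0 else 0.0
    return np.array([g0, x[1]])

def constant_step_sgd(x0, gamma=1e-2, n_iter=10_000, noise_std=0.1, seed=0):
    # Constant-step stochastic subgradient descent:
    #   x_{k+1} = x_k - gamma * (v_k + xi_k),
    # where v_k is a Clarke subgradient of the objective at x_k and xi_k is
    # zero-mean noise (Gaussian here, an assumption made only for this demo).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        noisy_subgrad = clarke_subgradient(x) + noise_std * rng.standard_normal(x.shape)
        x = x - gamma * noisy_subgrad
    return x

if __name__ == "__main__":
    x_final = constant_step_sgd(x0=[2.0, -3.0])
    print("final iterate:", x_final)  # expected to hover near the critical point (0, 0)

With a constant step the iterates are not expected to converge pointwise; rather, as in the result summarized above, they end up, with high probability, in a small neighborhood of the set of critical points.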

Keywords:
stochastic approximation, avoidance of traps, nonsmooth optimization, semialgebraic, o-minimality, stratifications, stochastic subgradient descent, ADAM, Nesterov's accelerated gradient descent, adaptive algorithms with momentum

Created from the Publication Database of the Vienna University of Technology.