Convex Optimization for Machine Learning by Eric Moulines
09 Jan 2020
28 – 31 January 2020
The Heilbronn Institute, Fry Building, University of Bristol, UK.
Eric Moulines, of École Polytechnique in Paris, will be visiting the Heilbronn Institute for Mathematical Research as a Data Science Visitor and will deliver a series of lectures on Convex Optimization for Machine Learning.
Convex Optimization for Machine Learning
Eric Moulines, École Polytechnique, Paris
The purpose of this course is to give an introduction to convex optimization and its applications in statistical learning.
In the first part of the course, I will recall the importance of convex optimisation in statistical learning and briefly introduce some useful results from convex analysis. I will then analyse gradient descent algorithms, first for strongly convex and then for convex smooth functions, and take this opportunity to establish complexity lower bounds for such problems. These bounds show that the gradient descent algorithm is suboptimal: it does not attain the best possible rate of convergence. I will then present a strategy for accelerating gradient descent in order to obtain optimal rates.
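To give a flavour of the algorithms in this first part, here is a small sketch of my own (not course material): plain gradient descent and Nesterov's accelerated variant, run on a smooth, strongly convex quadratic f(x) = 0.5 xᵀAx - bᵀx, whose gradient is Ax - b. The momentum sequence below is the standard t-sequence for the accelerated method.

```python
import numpy as np

# Illustrative sketch (not from the course): gradient descent vs.
# Nesterov's accelerated gradient on a strongly convex quadratic.

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M.T @ M / n + np.eye(n)        # well-conditioned positive definite Hessian
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)     # exact minimiser, for reference
L = np.linalg.eigvalsh(A).max()    # smoothness (Lipschitz-gradient) constant

def grad(x):
    return A @ x - b

def gradient_descent(x0, steps):
    x = x0.copy()
    for _ in range(steps):
        x = x - grad(x) / L        # classical 1/L step size
    return x

def nesterov(x0, steps):
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(steps):
        x_new = y - grad(y) / L
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x

x0 = np.zeros(n)
err_gd = np.linalg.norm(gradient_descent(x0, 200) - x_star)
err_ag = np.linalg.norm(nesterov(x0, 200) - x_star)
print(err_gd, err_ag)  # both small; the accelerated method is typically faster
```

On convex smooth problems the accelerated method improves the O(1/k) rate of gradient descent to the optimal O(1/k²).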
In the second part of the course, I will focus on non-smooth optimisation problems. I will introduce the proximal operator and establish some of its essential properties. I will then study proximal gradient algorithms and their accelerated versions.
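As a concrete illustration of this second part (a sketch of my own, not course material): the proximal operator of the l1 norm is soft-thresholding, and combining it with a gradient step on the smooth part gives the proximal gradient method (ISTA) for the lasso problem min_x 0.5 ||Ax - b||² + λ ||x||₁.

```python
import numpy as np

# Illustrative sketch: proximal gradient (ISTA) for the lasso.

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1: shrink each coordinate towards zero."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, steps=1000):
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth part's gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = A.T @ (A @ x - b)          # gradient of 0.5 * ||A x - b||^2
        x = soft_threshold(x - g / L, lam / L)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 3.0                       # sparse ground truth
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
print(np.count_nonzero(np.abs(x_hat) > 1e-3))  # far fewer than 100 nonzeros
```

Because soft-thresholding sets small coordinates exactly to zero, the iterates are genuinely sparse; the accelerated version (FISTA) replaces the plain gradient step with the momentum scheme from the smooth case.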
In the third part, I will consider stochastic versions of these algorithms.