Latest


2023.12.12: Lecture notes online.
2023.12.04: Board for lecture 4 + notebook 2 online.
2023.11.20: Board for lecture 3 online.
2023.11.16: Board for lecture 2 + updated notebook 1 online.
2023.11.09: Board for lecture 1 online.
2023.11.08: Notebook for the first session online.
2023.10.25: Course webpage online.

Lecturer

Clément Royer
clement.royer@lamsade.dauphine.fr


Regularized, large-scale and distributed optimization

Optimization for Machine Learning

M2 IASD/MASH, Université Paris Dauphine-PSL, 2023-2024


Program

     In these lectures, we introduce several concepts that can be combined with the methods seen in the rest of the course. The first two sessions describe regularization techniques, with a focus on sparsity-inducing regularization. The last two sessions are concerned with optimization methods tailored to problems with a very large number of variables, and present algorithms that are relevant to distributed data environments.
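To give a concrete flavor of the first half of the course, the sketch below applies a proximal gradient method (ISTA) to the LASSO problem, using the soft-thresholding operator as the proximal operator of the l1 norm. This is only an illustrative example with made-up data, not material from the lectures themselves.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, step, n_iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)              # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy example: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[[2, 7]] = [1.5, -2.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L with L = ||A||_2^2
x_hat = proximal_gradient_lasso(A, b, lam=0.5, step=step)
```

The step size 1/L, with L the Lipschitz constant of the gradient of the smooth term, is the standard conservative choice for this scheme.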

Schedule

     Lecture 1/4 (Nov. 9) Proximal methods.
     Lecture 2/4 (Nov. 16) Sparse optimization and LASSO.
     Lecture 3/4 (Nov. 20) Coordinate descent methods.
     Lecture 4/4 (Dec. 4) Distributed optimization.
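As a small illustration of the topic of lecture 3, the following sketch runs cyclic coordinate descent on the same LASSO objective, where each one-dimensional subproblem has a closed-form soft-thresholding solution. Again, this is an assumed toy setup for illustration, not course material.

```python
import numpy as np

def coordinate_descent_lasso(A, b, lam, n_epochs=100):
    """Cyclic coordinate descent for 0.5*||Ax - b||^2 + lam*||x||_1.
    Each coordinate update solves its 1D subproblem exactly."""
    n = A.shape[1]
    x = np.zeros(n)
    col_sq = np.sum(A ** 2, axis=0)            # ||a_j||^2 for each column
    r = b - A @ x                              # running residual b - Ax
    for _ in range(n_epochs):
        for j in range(n):
            # Correlation of column j with the residual, with x_j removed
            rho = A[:, j] @ r + col_sq[j] * x[j]
            x_new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r += A[:, j] * (x[j] - x_new)      # keep the residual up to date
            x[j] = x_new
    return x

# Toy example on synthetic sparse data.
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 15))
x_true = np.zeros(15)
x_true[[0, 5]] = [2.0, -1.0]
b = A @ x_true
x_hat = coordinate_descent_lasso(A, b, lam=0.5)
```

Maintaining the residual incrementally makes each coordinate update cost O(m) rather than recomputing Ax from scratch, which is what makes such methods attractive for very large numbers of variables.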

Course material

     Virtual board for lecture 1 (proximal methods) PDF
     Virtual board for lecture 2 (sparsity) PDF
     Python notebook on regularization [Sources]

     Virtual board for lecture 3 (coordinate descent) PDF
     Virtual board for lecture 4 (distributed optimization) PDF
     Python notebook on coordinate descent and ADMM [Sources]

     Lecture notes PDF


Materials on this page are available under the Creative Commons CC BY-NC 4.0 license.
The French version of this page can be found here.