Latest


2025.11.07: Material from session 7.
2025.11.03: Added back notebooks without solutions.
2025.10.29: Exercises V4.
2025.10.24: Corrected a typo in the course project.
2025.10.23: Material from session 6 + fixed notation in exercises 5.x.
2025.10.23: Exercises V3 + notebook for session 6 + project.
2025.10.16: Material from session 5.
2025.10.15: Exercises V2.
2025.10.10: Corrected virtual board for session 3.
2025.10.09: Exercises (V1.1), material from session 4.
2025.10.02: Material from session 3.
2025.09.25: Material from session 2.
2025.09.19: Material from session 1.
2025.09.15: The course webpage for Fall 2025 is online.

Instructor

Clément Royer
clement.royer@lamsade.dauphine.fr


Optimization for Machine Learning

M2 IASD, Université Paris Dauphine-PSL, 2025-2026


Aim of the course

     Study the main optimization techniques used in machine learning and data science, as well as their underlying principles.

Course project (tentative deadline: December 19, 2025)

     Assignment (Version Oct. 24) PDF

Course material

     Exercises (V4.1, Nov. 5) PDF

Session 1: Basics of optimization

     Intro slides PDF
     Virtual board PDF
     Backup slides (with material not covered in class) PDF

Session 2: Differentiation

     Virtual board PDF
     Notebook (without solutions) ZIP
     Notebook (with solutions) ZIP

Session 3: Gradient methods

     Virtual board (corrected version Oct. 10) PDF
     Notebook (without solutions) ZIP
     Notebook (with solutions) ZIP

Session 4: Nonconvex and nonsmooth optimization

     Virtual board PDF
     Illustration notebook ZIP

Session 5: Proximal methods

     Virtual board PDF
     Illustration notebook ZIP

Session 6: Stochastic gradient

     Virtual board PDF
     Notebook (without solutions) ZIP
     Notebook (with solutions) ZIP

Session 7: Advanced stochastic methods

     Virtual board PDF
     Illustration notebook ZIP


Materials on this page are available under Creative Commons CC BY-NC 4.0 license.
The French version of this page can be found here.