I am a Research Scientist at Samsung SAIL Montreal, affiliated with Mila. The lab is headed by Simon Lacoste-Julien. My research focuses on optimization for machine learning, especially accelerated and adaptive methods. I work closely with researchers from the University of Montreal and McGill University.
My main research interest is convex optimization, but I am also interested in numerical analysis. Here is a (non-exhaustive) selection of my research topics:
- Deterministic and Stochastic Optimization,
- Generic Acceleration Methods,
- Multisecant Quasi-Newton Methods,
- Algorithms for Variational Inequalities,
- Integration Algorithms for ODEs (and their links with optimization).
Post-doc (Princeton CS Department)
I worked for one year at Princeton University in the CS department with Elad Hazan and Sanjeev Arora. I collaborated with other researchers in ORFE (Samy Jelassi and Thomas Pumir), as well as with Nicolas Boumal from PACM. I worked on multisecant quasi-Newton methods and stochastic algorithms for game theory.
PhD Studies (INRIA/ENS)
I am a former PhD student (from 2015 to 2018) of Alexandre d'Aspremont and Francis Bach. I worked in the Sierra team, part of the Computer Science Department of École Normale Supérieure (Ulm). I also received the best thesis in data science award from PSL University.
My thesis, entitled Acceleration in Optimisation, focuses on the links between acceleration and numerical analysis. You can find the manuscript here. The main contribution is the design of the Regularized Nonlinear Acceleration algorithm, a generic way to improve the convergence rate of many optimization methods.
Master Studies (UCL)
I did my master's thesis in optimization under the supervision of Yurii Nesterov. I graduated from the École Polytechnique de Louvain in 2015 with a master's degree in Mathematical Engineering. At UCL, I also collaborated with Leopold Cambier and Anthony Papavasiliou on the creation of the FAST toolbox. At the same time, I worked with Raphael Jungers and Julien Hendrickx in the context of the JSR Toolbox.