I joined the Institute of Mathematics at EPFL on July 1st, 2020, where I lead the chair of continuous optimization (OPTIM).

My group works on optimization, statistical estimation and numerical analysis. Much of what we do relates to nonconvex optimization and optimization on manifolds. For the latter, we develop a toolbox called Manopt, and I wrote a book on the subject.

My 2021 ERC Starting Grant project, GEOSYM, runs from fall 2022 to 2027. Our goal is to harness geometry, symmetry and statistics in optimization to tackle nonconvexity. The project is funded by SERI.

As of Jan. 1st, 2022, I am an associate editor for Mathematical Programming.

My group's blog is Race to the bottom.

Research topics:

  • Nonconvex optimization (complexity, structure, global optimality)
  • Optimization on Riemannian manifolds
  • Semidefinite programs and relaxations in low-rank form
  • Low-rank optimization
  • Statistical estimation and bounds, notably under group actions
  • Synchronization (estimation from pairwise information)
  • Single particle reconstruction in cryo-electron microscopy
  • Curve fitting on manifolds


Positions prior to EPFL:

  • Instructor, then assistant professor, in the mathematics department at Princeton University. There, I also interacted closely with PACM (the Program in Applied and Computational Mathematics), especially the group of Amit Singer. (Feb. 2016–June 2020);
  • Postdoc at Inria in Paris, affiliated with the computer science department of the Ecole Normale Supérieure with Alexandre d'Aspremont in the SIERRA team, working on topics at the intersection of optimization and statistics. (Oct. 2014–Jan. 2016);
  • Ph.D. student working with Pierre-Antoine Absil and Vincent Blondel at UCLouvain, in the department of mathematical engineering. My dissertation is about optimization and estimation on manifolds. (Oct. 2010–Sep. 2014).

Group picture of the OPTIM chair at EPFL, October 2023

Theses

 

Reports

 

Journal papers

 

Conference papers

 

Talks

 

Posters

 
Nicolas Boumal
EPFL SB MATH
MA C2 627 (Bâtiment MA)
Station 8
CH-1015 Lausanne
Switzerland

Office: MA C2 627, 2nd floor
If you don't have a badge, use the MA building entrance near Satellite (two levels above ground floor) and walk all the way to the end of the corridor.

E-mail: nicolas.boumal@epfl.ch

Miscellaneous facts


My Erdős number is 3, courtesy of two co-authors (Vincent Blondel and Afonso Bandeira).

Research will get you places! It took me to: Palo Alto, Boston, Princeton, London, Prague, Cannes, Lisbon, Milan, Dagstuhl, Granada, Sierra Nevada, Valencia, Berlin, Les Houches, Costa da Caparica, Paris, Florence, San Diego, Bordeaux, Montréal, Bonn, Pittsburgh, Oxford, Geneva, New York City, Barcelona, Chicago, Vancouver, Ames, Minneapolis, Washington DC, Stockholm, Lausanne, Oaxaca, Villars-sur-Ollon, Pasadena, Interlaken, Chexbres, Zurich, Seattle, Lugano, Biel... and various places in Belgium (Louvain-la-Neuve, Leuven, Liège, La Roche, Mons, Knokke, Daverdisse, Spa, Namur, Bruxelles...).

Teaching at EPFL

  • Linear algebra I (Algèbre linéaire I, MATH-111(e)), Fall 2021, 2022, 2023
  • Continuous optimization (MATH-329), Spring 2021, 2022, Fall 2022, 2023, lecture notes
  • Optimization on manifolds (MATH-512), Spring 2021, 2023, book + videos
  • Mathematical foundations of neural networks (graduate seminar, MATH-631), Fall 2020

Teaching at Princeton University

  • Linear algebra with applications (MAT202), Spring 2016, 2017
  • Numerical methods (MAT321), Fall 2016, 2017, 2018, 2019, lecture notes
  • Junior seminar (optimization on manifolds, MAT982), Spring 2018
  • Junior seminar (math of data science through cryo-electron microscopy, MAT982), Fall 2018
  • Junior seminar (math of data science, MAT982), Fall 2019
  • Optimization on smooth manifolds (graduate course, MAT588), Spring 2019, 2020, book + videos


An introduction to optimization on smooth manifolds

This book on optimization on smooth manifolds requires no prerequisites in geometry or optimization. Chapters 3 and 5 can serve as a standalone introduction to differential and Riemannian geometry, focused on embedded submanifolds of linear spaces, with full proofs and an emphasis on computability. From there, the book builds up to equip readers to take on modern research challenges.
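
For a taste of the embedded viewpoint developed there, here is a standard example (stated for illustration; the notation is assumed here, not quoted from the book): on the unit sphere, tangent spaces and Riemannian gradients are given by explicit, computable formulas,
\[
  \mathrm{S}^{n-1} = \{ x \in \mathbb{R}^n : x^\top x = 1 \},
  \qquad
  \mathrm{T}_x\mathrm{S}^{n-1} = \{ v \in \mathbb{R}^n : x^\top v = 0 \},
\]
and, with the metric inherited from $\mathbb{R}^n$, the Riemannian gradient of a smooth $f$ obtained by restricting a smooth $\bar f \colon \mathbb{R}^n \to \mathbb{R}$ to the sphere is the orthogonal projection of the Euclidean gradient to the tangent space:
\[
  \operatorname{grad} f(x) = (I - x x^\top)\, \nabla \bar f(x).
\]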



You may also be interested in the Manopt toolboxes (Matlab, Python, Julia).
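
For instance, here is a minimal sketch of how the Python toolbox can be used to compute a dominant eigenvector by minimizing the (negated) Rayleigh quotient over a sphere, as in Section 2.2 below. The module and class names assume the pymanopt 2.x API and may differ in other versions.

    # Minimal sketch: dominant eigenvector of a symmetric matrix via
    # Riemannian optimization on the sphere (assumes the pymanopt 2.x API).
    import autograd.numpy as anp
    import pymanopt
    from pymanopt.manifolds import Sphere
    from pymanopt.optimizers import TrustRegions

    n = 100
    anp.random.seed(0)
    A = anp.random.normal(size=(n, n))
    A = (A + A.T) / 2  # symmetric matrix whose dominant eigenvector we seek

    manifold = Sphere(n)  # unit sphere S^{n-1} embedded in R^n

    @pymanopt.function.autograd(manifold)
    def cost(x):
        # Negative Rayleigh quotient: minimizers over the sphere are dominant eigenvectors of A.
        return -x @ A @ x

    problem = pymanopt.Problem(manifold, cost)
    result = TrustRegions().run(problem)  # Riemannian trust-region method
    print(result.point)  # approximate dominant eigenvector (unit norm)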

Table of contents
  • Preface
  • 1. Introduction
  • 2. Simple examples
    • 2.1 Sensor network localization from directions: an affine subspace
    • 2.2 Single extreme eigenvalue or singular value: spheres
    • 2.3 Dictionary learning: products of spheres
    • 2.4 Principal component analysis: Stiefel and Grassmann
    • 2.5 Synchronization of rotations: special orthogonal group
    • 2.6 Low-rank matrix completion: fixed-rank manifold
    • 2.7 Gaussian mixture models: positive definite matrices
    • 2.8 Smooth semidefinite programs
  • 3. Embedded geometry: first order
    • 3.1 Reminders of Euclidean space
    • 3.2 Embedded submanifolds of a linear space
    • 3.3 Smooth maps on embedded submanifolds
    • 3.4 The differential of a smooth map
    • 3.5 Vector fields and the tangent bundle
    • 3.6 Moving on a manifold: retractions
    • 3.7 Riemannian manifolds and submanifolds
    • 3.8 Riemannian gradients
    • 3.9 Local frames*
    • 3.10 Notes and references
  • 4. First-order optimization algorithms
    • 4.1 A first-order Taylor expansion on curves
    • 4.2 First-order optimality conditions
    • 4.3 Riemannian gradient descent
    • 4.4 Regularity conditions and iteration complexity
    • 4.5 Backtracking line-search
    • 4.6 Local convergence*
    • 4.7 Computing gradients*
    • 4.8 Numerically checking a gradient*
    • 4.9 Notes and references
  • 5. Embedded geometry: second order
    • 5.1 The case for another derivative of vector fields
    • 5.2 Another look at differentials of vector fields in linear spaces
    • 5.3 Differentiating vector fields on manifolds: connections
    • 5.4 Riemannian connections
    • 5.5 Riemannian Hessians
    • 5.6 Connections as pointwise derivatives*
    • 5.7 Differentiating vector fields on curves
    • 5.8 Acceleration and geodesics
    • 5.9 A second-order Taylor expansion on curves
    • 5.10 Second-order retractions
    • 5.11 Special case: Riemannian submanifolds*
    • 5.12 Special case: metric projection retractions*
    • 5.13 Notes and references
  • 6. Second-order optimization algorithms
    • 6.1 Second-order optimality conditions
    • 6.2 Riemannian Newton's method
    • 6.3 Computing Newton steps: conjugate gradients
    • 6.4 Riemannian trust regions
    • 6.5 The trust-region subproblem: truncated CG
    • 6.6 Local convergence of RTR with tCG*
    • 6.7 Simplified assumptions for RTR with tCG*
    • 6.8 Numerically checking a Hessian*
    • 6.9 Notes and references
  • 7. Embedded submanifolds: examples
    • 7.1 Euclidean spaces as manifolds
    • 7.2 The unit sphere in a Euclidean space
    • 7.3 The Stiefel manifold: orthonormal matrices
    • 7.4 The orthogonal group and rotation matrices
    • 7.5 Fixed-rank matrices
    • 7.6 The hyperboloid model
    • 7.7 Manifolds defined by $h(x) = 0$
    • 7.8 Notes and references
  • 8. General manifolds
    • 8.1 A permissive definition
    • 8.2 The atlas topology, and a final definition
    • 8.3 Embedded submanifolds are manifolds
    • 8.4 Tangent vectors and tangent spaces
    • 8.5 Differentials of smooth maps
    • 8.6 Tangent bundles and vector fields
    • 8.7 Retractions and velocity of a curve
    • 8.8 Coordinate vector fields as local frames
    • 8.9 Riemannian metrics and gradients
    • 8.10 Lie brackets as vector fields
    • 8.11 Riemannian connections and Hessians
    • 8.12 Covariant derivatives and geodesics
    • 8.13 Taylor expansions and second-order retractions
    • 8.14 Submanifolds embedded in manifolds
    • 8.15 Notes and references
  • 9. Quotient manifolds
    • 9.1 A definition and a few facts
    • 9.2 Quotient manifolds through group actions
    • 9.3 Smooth maps to and from quotient manifolds
    • 9.4 Tangent, vertical and horizontal spaces
    • 9.5 Vector fields
    • 9.6 Retractions
    • 9.7 Riemannian quotient manifolds
    • 9.8 Gradients
    • 9.9 A word about Riemannian gradient descent
    • 9.10 Connections
    • 9.11 Hessians
    • 9.12 A word about Riemannian Newton's method
    • 9.13 Total space embedded in a linear space
    • 9.14 Horizontal curves and covariant derivatives
    • 9.15 Acceleration, geodesics and second-order retractions
    • 9.16 Grassmann manifold: summary*
    • 9.17 Notes and references
  • 10. Additional tools
    • 10.1 Distance, geodesics and completeness
    • 10.2 Exponential and logarithmic maps
    • 10.3 Parallel transport
    • 10.4 Lipschitz conditions and Taylor expansions
    • 10.5 Transporters
    • 10.6 Finite difference approximation of the Hessian
    • 10.7 Tensor fields and their covariant differentiation
    • 10.8 Notes and references
  • 11. Geodesic convexity
    • 11.1 Convex sets and functions in linear spaces
    • 11.2 Geodesically convex sets and functions
    • 11.3 Alternative definitions of geodesically convex sets*
    • 11.4 Differentiable geodesically convex functions
    • 11.5 Geodesic strong convexity and Lipschitz continuous gradients
    • 11.6 Example: positive reals and geometric programming
    • 11.7 Example: positive definite matrices
    • 11.8 Notes and references
  • Bibliography