View on GitHub

Statistical Physics For Optimization and Learning

A Set of Lectures given at EPFL in 2021 by Lenka Zdeborová and Florent Krzakala



Main lecturers: Pr. Florent Krzakala, head of IdePHICS lab and Pr. Lenka Zdeborova, head of SPOC lab

Teaching assistants: Bruno Loureiro and Luca Saglietti

Main topics

Interest in the methods and concepts of statistical physics is rapidly growing in fields as diverse as theoretical computer science, probability, machine learning, discrete mathematics, optimization and compressed sensing. This course will cover this rich and active interdisciplinary research landscape.

More specifically, we will review the statistical physics approach to problems ranging from graph theory (percolation, community detection) to discrete optimization and constraint satisfaction (satisfiability, coloring, bisection) and to inference and learning problems (learning in neural networks, clustering of data and of networks, compressed sensing or sparse linear regression, low-rank matrix and tensor factorization, etc.).

We will introduce theoretical methods of analysis (replica, cavity, …) and algorithms (message passing, spectral methods, …), discuss concrete applications, highlight rigorous justifications, and present the connection to the physics of glassy and disordered systems.

The course is designed to be accessible to graduate students and researchers of all natural science and engineering disciplines with a basic knowledge of probability and analysis. Advanced training in any of the above fields is not required.

References: Information, Physics and Computation (Oxford Graduate Texts), 2009, M. Mézard, A. Montanari

Statistical Physics of inference: Thresholds and algorithms, Advances in Physics 65, 5 2016, L. Zdeborová & F. Krzakala

Times and Place

The lectures will take place on Zoom: click here. Recorded lectures will be available on SWITCHtube: click here.

Main lectures will be on Friday, 10:00-12:00, and exercise sessions on Thursday, 10:00-12:00.

Discussion will take place on a Slack channel (ask to be invited).

Detailed plan:

Lecture notes: Chapter 1-6

Lecture 1, 26/2/2021 “Curie-Weiss model”

Homework 1, due on Wed 10.3 (no delay allowed): Exercise 1.4 (Metropolis-Hastings algorithm) & Exercise 1.5 (Glauber algorithm). Please upload your homework on Moodle.

Corrections for Exercises 1.1-1.3

Lecture 2, 05/3/2021 “Your first replica computations”

Homework 2, due on Wed 17.3 (no delay allowed): Exercise 2.3 (Mean-field algorithm and state evolution) & Exercise 3.1 (Wishart matrix). Please upload your homework on Moodle.

Corrections for Exercises 1.4-2.2 and the code

Lecture 3, 12/3/2021 “Belief Propagation”

Homework 3, due on Wed 24.3 (no delay allowed): Exercise 4.1 (Representing problems by graphical models) & Exercise 4.2 (Bethe free entropy). Please upload your homework on Moodle.
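To get a feel for the message-passing updates before tackling the exercises, here is a toy belief-propagation sketch for an Ising chain (a tree, where BP is exact). The cavity-field parametrisation u and the update f(x) = atanh(tanh(βJ) tanh(βx))/β follow the standard derivation; the function name and parameter values are illustrative.

```python
import numpy as np

def bp_ising_chain(N=10, beta=0.5, J=1.0, h=0.2, iters=50):
    """Belief propagation on an Ising chain. Messages are cavity fields:
    u[i, 1] goes from spin i to its right neighbour, u[i, 0] to its left.
    On a tree the BP fixed point gives the exact magnetisations."""
    u = np.zeros((N, 2))

    def f(x):  # cavity-field update: atanh(tanh(bJ) tanh(bx)) / b
        return np.arctanh(np.tanh(beta * J) * np.tanh(beta * x)) / beta

    for _ in range(iters):
        new = np.zeros_like(u)
        for i in range(N):
            # message to the right uses the incoming message from the left
            new[i, 1] = h + (f(u[i - 1, 1]) if i > 0 else 0.0)
            new[i, 0] = h + (f(u[i + 1, 0]) if i < N - 1 else 0.0)
        u = new

    # local magnetisation: full field = h + both incoming cavity contributions
    left_in = np.array([f(u[i - 1, 1]) if i > 0 else 0.0 for i in range(N)])
    right_in = np.array([f(u[i + 1, 0]) if i < N - 1 else 0.0 for i in range(N)])
    return np.tanh(beta * (h + left_in + right_in))

m = bp_ising_chain()
```

With a positive field, bulk spins (which receive two cavity messages) end up more magnetised than the edge spins (which receive only one), and the profile is symmetric about the middle of the chain.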

Corrections for Exercises 2.3-3.1 and the code

Lecture 4, 19/3/2021 “BP for Graph coloring and Potts models”

Homework 4, due on Wed 31.3: Exercise 5.1 (Bethe free entropy for coloring), Exercise 5.2 (fully connected limit) & Exercise 5.3 (BP for matching). Please upload your homework on Moodle.

Download here the notebook with the plots from Chapter 5.

Corrections for Exercises 4.1, 4.2.

Lecture 5, 26/3/2021 “Denoising and Optimal Bayesian Inference”

Corrections for Exercises 5.1-5.3.

Lecture 6, 1/4/2021 “Rank-one Matrix factorization”

Homework 5, due on Wed 14.4 (no delay allowed): Exercise 7.1 (The BBP transition) & Exercise 7.2 (More phase transitions). Please upload your homework on Moodle.
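The BBP transition in Exercise 7.1 can be seen numerically. Here is a minimal sketch for the symmetric rank-one spiked model Y = (λ/N) x xᵀ + W/√N with W a GOE matrix: above the threshold λ > 1 the top eigenvalue detaches from the bulk edge at 2 and sits near λ + 1/λ. The model normalisation, function name, and seed are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def top_eigenvalue_spiked(N=1000, lam=2.0):
    """Top eigenvalue of Y = (lam/N) x x^T + W/sqrt(N), with x normalised
    to |x|^2 = N and W a GOE matrix (off-diagonal variance 1)."""
    x = rng.standard_normal(N)
    x *= np.sqrt(N) / np.linalg.norm(x)
    W = rng.standard_normal((N, N))
    W = (W + W.T) / np.sqrt(2)        # symmetrise: GOE convention
    Y = (lam / N) * np.outer(x, x) + W / np.sqrt(N)
    return np.linalg.eigvalsh(Y)[-1]  # eigvalsh returns sorted eigenvalues

lam = 2.0
e = top_eigenvalue_spiked(N=1000, lam=lam)
# for lam = 2 > 1, e should land near lam + 1/lam = 2.5, well above the bulk edge at 2
```

Rerunning with lam below 1 should give a top eigenvalue sticking to the bulk edge near 2, illustrating why the spike becomes spectrally invisible below the transition.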