
Statistical Physics For Optimization and Learning

A Set of Lectures given at EPFL in 2021 by Lenka Zdeborova and Florent Krzakala


Lecturers

Main lecturers: Prof. Florent Krzakala, head of the IdePHICS lab, and Prof. Lenka Zdeborová, head of the SPOC lab

Teaching assistants: Bruno Loureiro and Luca Saglietti

Main topics

Interest in the methods and concepts of statistical physics is rapidly growing in fields as diverse as theoretical computer science, probability, machine learning, discrete mathematics, optimization and compressed sensing. This course will cover this rich and active interdisciplinary research landscape.

More specifically, we will review the statistical physics approach to problems ranging from graph theory (percolation, community detection) to discrete optimization and constraint satisfaction (satisfiability, coloring, bisection) and to inference and learning problems (learning in neural networks, clustering of data and of networks, compressed sensing or sparse linear regression, low-rank matrix and tensor factorization, etc.).

We will present theoretical methods of analysis (replica, cavity, …) and algorithms (message passing, spectral methods, …), discuss concrete applications, highlight rigorous justifications, and present the connection to the physics of glassy and disordered systems.

The course is designed to be accessible to graduate students and researchers of all natural science and engineering disciplines with a basic knowledge of probability and analysis. Advanced training in any of the above fields is not required.

References:

M. Mézard & A. Montanari, Information, Physics and Computation (Oxford Graduate Texts), 2009.

L. Zdeborová & F. Krzakala, Statistical Physics of Inference: Thresholds and Algorithms, Advances in Physics 65 (5), 2016.

Times and Place

The lectures will take place on Zoom: click here. Recorded lectures will be available on SWITCHtube: click here.

Main lectures will be on Friday, 10:00-12:00, and exercise sessions will be on Thursday, 10:00-12:00.

Discussion will take place on a Slack channel (ask to be invited).

Detailed plan:

Lecture notes: Chapters 1-13

Lecture 1, 26/2/2021 “Curie-Weiss model”

Homework 1, due on Wed 10.3 (no delay allowed): Exercise 1.4 (Metropolis-Hastings algorithm) & Exercise 1.5 (Glauber algorithm). Please upload your homework on Moodle.
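
To get started with the sampling exercises, here is a minimal Python sketch of Metropolis-Hastings dynamics for the Curie-Weiss model. It is only an illustration, not the expected solution: the Hamiltonian convention H(s) = -(J/N) Σ_{i<j} s_i s_j - h Σ_i s_i, the function name, and all parameter values are assumptions made for this example.

```python
# Minimal sketch (illustrative, not the exercise solution): Metropolis-Hastings
# for the Curie-Weiss model with H(s) = -(J/N) sum_{i<j} s_i s_j - h sum_i s_i.
import numpy as np

def metropolis_curie_weiss(N=1000, beta=1.2, J=1.0, h=0.0, n_sweeps=200, seed=0):
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=N)           # random initial configuration
    m = s.mean()                               # magnetization, kept up to date
    history = []
    for _ in range(n_sweeps):
        for _ in range(N):                     # one sweep = N single-spin attempts
            i = rng.integers(N)
            # energy change of flipping spin i: dE = 2 s_i (J (m - s_i/N) + h)
            dE = 2.0 * s[i] * (J * (m - s[i] / N) + h)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                m -= 2.0 * s[i] / N            # update magnetization, then flip
                s[i] = -s[i]
        history.append(m)
    return np.array(history)

# For beta*J > 1 the magnetization should settle near a nonzero fixed point.
print(metropolis_curie_weiss()[-5:])
```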

Lecture 2, 05/3/2021 “Your first replica computations”

Homework 2, due on Wed 17.3 (no delay allowed): Exercise 2.3 (Mean-field algorithm and state evolution) & Exercise 3.1 (Wishart matrix). Please upload your homework on Moodle.
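
As a numerical companion to the Wishart exercise, the sketch below samples a Wishart matrix W = XᵀX/n and compares its spectrum to the Marchenko-Pastur law. The normalization, the function names, and the aspect ratio used here are assumptions for this illustration only.

```python
# Minimal sketch (illustrative): spectrum of a Wishart matrix W = X^T X / n,
# with X an n x d standard Gaussian matrix, versus the Marchenko-Pastur law.
import numpy as np

def wishart_spectrum(n=2000, d=1000, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, d))
    W = X.T @ X / n                            # d x d Wishart matrix
    return np.linalg.eigvalsh(W)

def marchenko_pastur_pdf(x, alpha):
    # Bulk density for alpha = d/n <= 1, supported on [(1-sqrt(alpha))^2, (1+sqrt(alpha))^2]
    lam_minus, lam_plus = (1 - np.sqrt(alpha)) ** 2, (1 + np.sqrt(alpha)) ** 2
    inside = (x > lam_minus) & (x < lam_plus)
    pdf = np.zeros_like(x)
    pdf[inside] = np.sqrt((lam_plus - x[inside]) * (x[inside] - lam_minus)) / (
        2 * np.pi * alpha * x[inside]
    )
    return pdf

eigs = wishart_spectrum()
# Compare the edges of the empirical spectrum with the MP edges for alpha = 0.5;
# a histogram of `eigs` against marchenko_pastur_pdf shows the full agreement.
print(eigs.min(), eigs.max())
```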

Lecture 3, 12/3/2021 “Belief Propagation”

Homework 3, due on Wed 24.3 (no delay allowed): Exercise 4.1 (Representing problems by graphical models) & Exercise 4.2 (Bethe free entropy). Please upload your homework on Moodle.
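
As a warm-up for the message-passing exercises, here is a minimal sketch of belief propagation on an open Ising chain, where the graph is a tree and the BP marginals are exact. The cavity-field parametrization of the messages and all parameter values are choices made for this example, not the notes' reference code.

```python
# Minimal sketch (illustrative): BP on a ferromagnetic Ising chain, with messages
# parametrized by cavity fields u. On a tree BP gives the exact magnetizations.
import numpy as np

def bp_ising_chain(N=50, beta=0.8, J=1.0, h=0.1):
    def next_field(u):
        # BP message update: u_out = (1/beta) * atanh( tanh(beta*J) * tanh(beta*(h + u_in)) )
        return np.arctanh(np.tanh(beta * J) * np.tanh(beta * (h + u))) / beta

    u_from_left = np.zeros(N)    # cavity field reaching spin i from its left neighbour
    u_from_right = np.zeros(N)   # cavity field reaching spin i from its right neighbour
    for i in range(1, N):                  # forward sweep of messages
        u_from_left[i] = next_field(u_from_left[i - 1])
    for i in range(N - 2, -1, -1):         # backward sweep of messages
        u_from_right[i] = next_field(u_from_right[i + 1])
    # Exact single-spin marginals: m_i = tanh(beta * (h + incoming cavity fields))
    return np.tanh(beta * (h + u_from_left + u_from_right))

print(bp_ising_chain()[:5])   # magnetizations near the left edge of the chain
```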

Lecture 4, 19/3/2021 “BP for Graph coloring and Potts models”

Homework 4, due on Wed 31.3: Exercise 5.1 (Bethe free entropy for coloring), Exercise 5.3 (fully connected limit) & Exercise 5.3 (BP for matching). Please upload your homework on Moodle.

Download the notebook with the plots from Chapter 5 here.

Lecture 5, 26/3/2021 “Denoising and Optimal Bayesian Inference”

Lecture 6, 1/4/2021 “Rank-one Matrix factorization”

Homework 5, due on Wed 14.4 (no delay allowed): Exercise 7.1 (The BBP transition) & Exercise 7.2 (More phase transitions). Please upload your homework on Moodle.
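
A quick way to see the BBP phenomenon numerically is sketched below, under the assumed spiked Wigner convention Y = θ uuᵀ + W/√N with a unit-norm spike u; this setup and the parameter values are illustrative assumptions, not the exercise statement.

```python
# Minimal sketch (illustrative): BBP-type transition for a rank-one spike added
# to a Wigner matrix. Above theta = 1 the top eigenvalue detaches from the bulk
# edge (at 2) and the top eigenvector acquires a macroscopic overlap with the spike.
import numpy as np

def spiked_wigner_top(N=2000, theta=1.5, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(N)
    u /= np.linalg.norm(u)                      # unit-norm spike direction
    G = rng.standard_normal((N, N))
    W = (G + G.T) / np.sqrt(2)                  # symmetric Gaussian (Wigner) noise
    Y = theta * np.outer(u, u) + W / np.sqrt(N)
    eigval, eigvec = np.linalg.eigh(Y)
    return eigval[-1], abs(eigvec[:, -1] @ u)   # top eigenvalue and overlap with u

for theta in (0.5, 1.5, 3.0):
    lam, overlap = spiked_wigner_top(theta=theta)
    # Prediction: lam -> 2 and overlap -> 0 for theta < 1;
    #             lam -> theta + 1/theta and overlap^2 -> 1 - 1/theta^2 for theta > 1.
    print(theta, round(lam, 3), round(overlap, 3))
```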

Lecture 7, 16/4/2021 “Cavity method and AMP”

Homework 6, due on Wed 21.4 (no delay allowed): Exercise 8.1 (coding AMP for Exercises 7.1 and 7.2). Please upload your homework on Moodle.
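
For orientation before coding, here is a minimal sketch of AMP for the spiked Wigner model with a Rademacher prior. The model convention Y = (λ/n) vvᵀ + W, the tanh denoiser, the spectral warm start, and the parameter values are assumptions for this illustration, not the course's reference implementation.

```python
# Minimal sketch (illustrative): AMP for the spiked Wigner model
# Y = (lam/n) v v^T + W, with v_i = +/-1 and W a GOE matrix (entries ~ N(0, 1/n)).
# For the Rademacher prior the posterior-mean denoiser is tanh(lam * x).
import numpy as np

def amp_spiked_wigner(n=1000, lam=1.5, n_iter=30, seed=0):
    rng = np.random.default_rng(seed)
    v = rng.choice([-1.0, 1.0], size=n)                 # planted rank-one signal
    G = rng.standard_normal((n, n)) / np.sqrt(n)
    W = (G + G.T) / np.sqrt(2)                          # GOE noise, variance ~ 1/n
    Y = (lam / n) * np.outer(v, v) + W

    x = np.linalg.eigh(Y)[1][:, -1] * np.sqrt(n)        # spectral warm start (one choice)
    f_old = np.zeros(n)
    for _ in range(n_iter):
        f = np.tanh(lam * x)                            # posterior-mean denoiser
        b = (lam / n) * np.sum(1 - np.tanh(lam * x) ** 2)   # Onsager coefficient
        x = Y @ f - b * f_old                           # AMP update with memory term
        f_old = f
    estimate = np.tanh(lam * x)
    return abs(estimate @ v) / n                        # overlap with the signal

print(amp_spiked_wigner(lam=1.5))   # should be well above 0 for lam > 1
```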

Lecture 8, 23/4/2021 “Stochastic Block Model and Community detection”

Homework 7, due on Wed 28.4 (no delay allowed): Exercises 9.1 and 9.2. Please upload your homework on Moodle.

Lecture 9, 30/4/2021 “Graph Coloring II”

Lecture 10, 7/5/2021 “Graph Coloring III”

Homework 8, due on Wed 19.5 (no delay allowed): Exercise 11.1 (Random subcube model). Please upload your homework on Moodle.

Lecture 11, 15/5/2021 “Replica Symmetry Breaking and the Random Energy Model”

Homework 9, due on Wed 26.5 (no delay allowed): Exercise 12.2 (Second moment of the participation ratio; read Section 12.2.4 of the notes first) OR Exercise 12.1 (REM as a p-spin model). Please upload your homework on Moodle.

Lecture 12, 21/5/2021 “Linear problems, LASSO and Approximate Message Passing”

Homework 10, due on Wed 2.6 (no delay allowed): Exercise 13.1 (coding AMP for compressed sensing). Please upload your homework on Moodle.
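
To help with the final coding exercise, here is a minimal sketch of AMP with soft thresholding for noisy compressed sensing, in the spirit of the Donoho-Maleki-Montanari iteration. The measurement-matrix scaling, the residual-based threshold heuristic, and all parameter values are assumptions for this illustration, not the course's reference code.

```python
# Minimal sketch (illustrative): AMP with soft thresholding for compressed sensing
# y = A x + w, where A is m x n with i.i.d. N(0, 1/m) entries and x is sparse.
import numpy as np

def soft_threshold(u, theta):
    return np.sign(u) * np.maximum(np.abs(u) - theta, 0.0)

def amp_compressed_sensing(n=2000, delta=0.5, rho=0.1, sigma=0.01,
                           alpha=1.5, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    m = int(delta * n)
    x0 = rng.standard_normal(n) * (rng.random(n) < rho)   # sparse Gaussian signal
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    y = A @ x0 + sigma * rng.standard_normal(m)

    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        r = x + A.T @ z                                  # effective observations
        tau = alpha * np.linalg.norm(z) / np.sqrt(m)     # threshold from residual scale
        x_new = soft_threshold(r, tau)
        onsager = (z / delta) * np.mean(np.abs(x_new) > 0)  # Onsager reaction term
        z = y - A @ x_new + onsager
        x = x_new
    return np.mean((x - x0) ** 2)                        # reconstruction MSE

print(amp_compressed_sensing())   # the MSE should drop far below the signal variance
```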