PhD course
This is the web page for a set of lectures on statistical physics and machine learning, given in the spring of 2019.
First note: the course is given "as is". Students who want to take the course for credit should agree with their supervisor on how many credits the course is worth, and on what is required in terms of documentation, examination, etc.
Second note: for convenience, the course is announced through a web page under the KTH School of Computer Science and Communication (CSC), which is no longer a separate organizational entity at KTH. A link to the CSC "Code of honor" is nevertheless maintained, as that information remains generally valid.
Schedule
The schedule has so far been fixed until the end of February; any later lectures will be scheduled in due course.
Unless indicated otherwise, the lectures will be held in room 112:028. The entrance door to the building on the Roslagstullsbacken side is locked, so please be on time.
Second lecture | Wednesday | February 20 | 10.15-12.00
Third lecture | Friday | February 22 | 10.15-12.00
Fourth lecture | Monday | February 25 | 10.15-12.00
Fifth lecture | Wednesday | February 27 | 10.15-12.00
Sixth lecture | Monday | March 11 | 10.15-12.00
Seventh lecture | Wednesday | March 13 | 10.15-12.00
Contents
In the last twenty years there has been a great deal of activity at the interface between statistical mechanics (particularly disordered systems) and machine learning. Several techniques have been invented independently in both fields; the best-known example is the cavity method (on the statistical mechanics side), known as Belief Propagation (on the machine learning side).
The course aims to present some results from this area, loosely following the 2008 monograph by Mezard and Montanari. Possible detailed topics:
- Cavity method and Belief Propagation.
- Relation between cavity method and Bethe-Peierls approximation to the free energy.
- Satisfiability problems and the cavity method.
- Inverse Ising and inverse Potts problems.
- Dynamics, especially the dynamical cavity method.
- Other topics of interest.
From the physics point of view this course is a follow-up to the course given in the fall of 2018. There the focus was on fully connected mean-field problems with disorder, for which the prototype is the Sherrington-Kirkpatrick model. This time the focus is on systems naturally described by locally tree-like graphs, the prototype being random graphs at finite density (the number of links per node is a finite number of order one). This is essentially the setting for which the cavity method / Belief Propagation was invented, and it describes many interesting systems such as random satisfiability problems.
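As a toy illustration of the sum-product / Belief Propagation idea mentioned above, here is a minimal sketch (my own, not taken from the course material) of BP on a three-spin Ising chain, the simplest tree-like graph. On a tree BP is exact, so the marginals it produces must coincide with brute-force enumeration; the couplings `J` and `h` below are arbitrary illustrative values.

```python
# Belief Propagation (sum-product) on a 3-spin Ising chain 0 - 1 - 2,
# with pairwise factor exp(J*si*sj) and single-site factor exp(h*si).
# The chain is a tree, so one inward/outward sweep converges exactly.
import itertools
import math

J, h = 0.5, 0.2                  # coupling and field (arbitrary choices)
spins = (-1, +1)
edges = [(0, 1), (1, 2)]


def psi(si, sj):
    """Pairwise Ising factor exp(J * si * sj)."""
    return math.exp(J * si * sj)


def phi(si):
    """Single-site field factor exp(h * si)."""
    return math.exp(h * si)


# Messages m[(i, j)][s] = message from spin i to spin j, as a function
# of the receiving spin s. Leaves send first, then node 1 replies.
m = {}
for leaf, target in [(0, 1), (2, 1)]:
    m[(leaf, target)] = {s: sum(phi(t) * psi(t, s) for t in spins)
                         for s in spins}
m[(1, 0)] = {s: sum(phi(t) * psi(t, s) * m[(2, 1)][t] for t in spins)
             for s in spins}
m[(1, 2)] = {s: sum(phi(t) * psi(t, s) * m[(0, 1)][t] for t in spins)
             for s in spins}


def bp_marginal(i, neighbours):
    """Belief at spin i: local factor times all incoming messages."""
    b = {s: phi(s) * math.prod(m[(k, i)][s] for k in neighbours)
         for s in spins}
    Z = sum(b.values())
    return {s: b[s] / Z for s in spins}


def exact_marginal(i):
    """Brute-force marginal over all 2^3 spin configurations."""
    w = {s: 0.0 for s in spins}
    for conf in itertools.product(spins, repeat=3):
        weight = math.prod(phi(s) for s in conf)
        weight *= math.prod(psi(conf[a], conf[b]) for a, b in edges)
        w[conf[i]] += weight
    Z = sum(w.values())
    return {s: w[s] / Z for s in spins}


# On a tree, BP marginals agree with exact enumeration to machine precision.
for node, nbrs in [(0, [1]), (1, [0, 2]), (2, [1])]:
    bp, ex = bp_marginal(node, nbrs), exact_marginal(node)
    assert all(abs(bp[s] - ex[s]) < 1e-12 for s in spins)
```

On a loopy graph the same message updates no longer terminate after one sweep and are only approximate; the course material discusses when and why they still work well on locally tree-like random graphs.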
Literature
The main reference is the 2008 monograph by Mezard and Montanari mentioned above.
A condensed version of one main argument can be found in
Some selected papers on the topic (the list may be continuously updated):
- Jonathan S. Yedidia, William T. Freeman, Yair Weiss
Understanding Belief Propagation and its Generalizations (2001)
- Frank R. Kschischang, Brendan J. Frey, Hans-Andrea Loeliger
Factor Graphs and the Sum-Product Algorithm
IEEE Trans. Inf. Th., 47, 498-519 (2001)
- Marc Mezard, Giorgio Parisi, Riccardo Zecchina
Analytic and Algorithmic Solution of Random Satisfiability Problems
Science 297, 812-815 (2002)
- Marc Mezard, Riccardo Zecchina
Random K-satisfiability problem: From an analytic solution to an efficient algorithm
Phys. Rev. E 66, 056126 (2002)
- Alfredo Braunstein, Marc Mezard, Riccardo Zecchina
Survey propagation: An algorithm for satisfiability
Random Structures and Algorithms 27, 201-226 (2005)
- Roberto Mulet, Andrea Pagnani, Martin Weigt, Riccardo Zecchina
Coloring random graphs
Phys. Rev. Lett. 89, 268701 (2002)
- Lenka Zdeborova, Florent Krzakala
Phase Transitions in the Coloring of Random Graphs
Phys. Rev. E 76, 031131 (2007)
- Aurelien Decelle, Florent Krzakala, Cristopher Moore, Lenka Zdeborova
Asymptotic analysis of the stochastic block model for modular networks and its algorithmic applications
Phys. Rev. E 84, 066106 (2011)
- H. Chau Nguyen, Riccardo Zecchina, Johannes Berg
Inverse statistical problems: from the inverse Ising problem to data science
Advances in Physics, 66 (3), 197-261 (2017)
Information Geometry
In the first lecture this area of statistics came up. Though interesting, it is somewhat marginal to the course, except perhaps for the last part on inverse Ising/Potts problems (inference problems). Here are some references, including on the relation to mean-field and variational methods:
- Information Geometry on wikipedia.
- Shun-ichi Amari, Differential-geometrical methods in statistics, Lecture Notes in Statistics, Springer-Verlag, Berlin (1985)
- Toshiyuki Tanaka, Information Geometry of Mean-Field Approximation, Neural Computation
12:1951-1968 (2000)
- Shun-ichi Amari, Shiro Ikeda, Hidetoshi Shimokawa,
Information Geometry of alpha-Projection in Mean Field Approximation, in "Advanced Mean Field Methods: Theory and Practice" (edited by M. Opper and D. Saad), The MIT Press (2001)
Contact
Erik Aurell, tel: 790 69 84, e-mail: eaurell@kth.se