By Oliver Y. Feng, University of Cambridge, UK, o.feng@statslab.cam.ac.uk | Ramji Venkataramanan, University of Cambridge, UK, ramji.v@eng.cam.ac.uk | Cynthia Rush, Columbia University, USA, cgr2130@columbia.edu | Richard J. Samworth, University of Cambridge, UK, r.samworth@statslab.cam.ac.uk
Over the last decade or so, Approximate Message Passing (AMP) algorithms have become extremely popular in a range of structured high-dimensional statistical problems. Although the origins of these techniques can be traced back to belief propagation and related ideas in the statistical physics and engineering literatures, and they have recently been extended for use in computer science and machine learning, our goals in this work are to present the main ideas of AMP from a statistical perspective and to illustrate the power and flexibility of the AMP framework. Along the way, we strengthen and unify many of the results in the existing literature.
In this tutorial, the authors give a comprehensive and rigorous introduction to what AMP can offer, unifying and formalizing the core concepts within the large body of recent work in the area. They lead the reader through the basic concepts of AMP before introducing the problem of low-rank matrix estimation, and conclude by covering generalized models. To complete the picture for researchers, proofs, technical remarks and mathematical background are also provided.
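To give a flavour of the basic concepts covered, the following is a minimal sketch of an AMP iteration for the linear model y = Ax + noise, using a soft-thresholding denoiser and the Onsager correction term. This is an illustrative implementation under standard assumptions (i.i.d. Gaussian design, sparse signal), not the authors' own code; the function names and the fixed threshold parameter are choices made here for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding denoiser eta_t(v).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def amp(y, A, theta=1.0, iters=25):
    # Sketch of AMP for y = A x + noise with a soft-thresholding denoiser.
    # theta is a fixed threshold here; in practice it is often tuned or
    # adapted across iterations via state evolution.
    n, p = A.shape
    x = np.zeros(p)      # current signal estimate
    z = y.copy()         # current residual
    for _ in range(iters):
        # Effective observation passed to the denoiser.
        s = x + A.T @ z
        x_new = soft_threshold(s, theta)
        # Onsager correction: (p/n) times the average derivative of the
        # denoiser, i.e. the fraction of coordinates above the threshold.
        b = (p / n) * np.mean(np.abs(s) > theta)
        z = y - A @ x_new + b * z
        x = x_new
    return x
```

The Onsager term `b * z` is what distinguishes AMP from a plain iterative thresholding scheme: it approximately Gaussianizes the effective noise in `s`, which is what makes the algorithm's behaviour amenable to the sharp asymptotic analysis (state evolution) developed in the tutorial.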
This tutorial is an in-depth introduction to Approximate Message Passing for students and researchers who are new to the topic.