The Statistical Physics of Learning Revisited: Typical Learning Curves in Model Scenarios

Michael Biehl*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic


The exchange of ideas between computer science and statistical physics has significantly advanced the understanding of machine learning and inference. This interdisciplinary approach is currently regaining momentum due to the revived interest in neural networks and deep learning. Methods borrowed from statistical mechanics complement other approaches to the theory of computational and statistical learning. In this brief review, we outline and illustrate some of the basic concepts. We exemplify the role of the statistical physics approach in terms of a particularly important contribution: the computation of typical learning curves in student-teacher scenarios of supervised learning. Two by-now-classical examples from the literature illustrate the approach: the learning of a linearly separable rule by a perceptron with continuous and with discrete weights, respectively. We address these prototypical problems in the simplifying limit of stochastic training at high formal temperature and obtain the corresponding learning curves.
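The student-teacher scenario described above can be illustrated numerically. The sketch below is not the high-temperature statistical mechanics calculation of the chapter, but a minimal simulation of the setup it analyzes: a random "teacher" weight vector defines a linearly separable rule, a "student" perceptron is trained on teacher-labelled examples via Rosenblatt updates, and the generalization error is tracked as a function of the ratio alpha = P/N of examples to input dimension. The dimension `N`, the training schedule, and the sampled values of alpha are illustrative choices, not taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50  # input dimension (illustrative choice)

# A random teacher vector defines the linearly separable target rule
teacher = rng.standard_normal(N)
teacher /= np.linalg.norm(teacher)

def generalization_error(student):
    # For isotropic random inputs, the probability of disagreement
    # between student and teacher is arccos(overlap) / pi
    overlap = student @ teacher / np.linalg.norm(student)
    return np.arccos(np.clip(overlap, -1.0, 1.0)) / np.pi

def train_perceptron(P, epochs=50):
    # P teacher-labelled random examples; classical Rosenblatt updates
    X = rng.standard_normal((P, N))
    y = np.sign(X @ teacher)
    w = np.zeros(N)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:  # misclassified (or on the boundary)
                w += yi * xi
    return w

# Crude empirical learning curve: generalization error vs. alpha = P / N
errors = {alpha: generalization_error(train_perceptron(alpha * N))
          for alpha in (1, 5, 20)}
print(errors)
```

As alpha grows, the student's overlap with the teacher increases and the empirical generalization error decreases, qualitatively mirroring the learning curves derived analytically in the chapter.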

Original language: English
Title of host publication: Brain-Inspired Computing
Subtitle of host publication: 4th International Workshop, BrainComp 2019, Revised Selected Papers
Editors: Katrin Amunts, Lucio Grandinetti, Thomas Lippert, Nicolai Petkov
Number of pages: 15
ISBN (Electronic): 978-3-030-82427-3
ISBN (Print): 978-3-030-82426-6
Publication status: Published - Jul-2021
Event: 4th International Workshop on Brain-Inspired Computing, BrainComp 2019 - Cetraro, Italy
Duration: 15-Jul-2019 - 19-Jul-2019

Publication series

Name: Lecture Notes in Computer Science
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 4th International Workshop on Brain-Inspired Computing, BrainComp 2019
