PoS - Proceedings of Science
Volume 466 - The 41st International Symposium on Lattice Field Theory (LATTICE2024) - Algorithms and Artificial Intelligence
Random Matrix Theory for Stochastic Gradient Descent
C. Park*, M. Favoni, B. Lucini and G. Aarts
Pre-published on: January 16, 2025
Abstract
Investigating the dynamics of learning in machine learning algorithms is of paramount importance for understanding how and why an approach may be successful. The tools of physics and statistics provide a robust setting for such investigations. Here, we apply concepts from random matrix theory to describe stochastic weight matrix dynamics, using the framework of Dyson Brownian motion. We derive the linear scaling rule between the learning rate (step size) and the batch size, and identify universal and non-universal aspects of weight matrix dynamics. We test our findings in the (near-)solvable case of the Gaussian Restricted Boltzmann Machine and in a linear one-hidden-layer neural network.
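As a sketch of where the linear scaling rule comes from (illustrative notation under standard minibatch SGD assumptions, not the specific conventions of the paper): a weight matrix W updated with learning rate \alpha on a minibatch B of size |B| receives an unbiased gradient estimate whose fluctuations are suppressed by 1/|B|, so a single update step reads

\[
W \;\to\; W - \alpha \, \nabla_W L(W) + \alpha \, \xi,
\qquad
\langle \xi \rangle = 0,
\qquad
\mathrm{Var}(\xi) \sim \frac{\Sigma}{|B|}.
\]

Matching this to a discretised Dyson Brownian motion, W \to W + D\,\Delta t + \sqrt{2T\,\Delta t}\,\eta, identifies the time step \Delta t \sim \alpha and an effective temperature T \sim \alpha\,\Sigma/|B|. Keeping the stochastic matrix dynamics unchanged therefore requires the ratio \alpha/|B| to be held fixed, i.e. the learning rate should be scaled linearly with the batch size.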
DOI: https://doi.org/10.22323/1.466.0031
Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.