Neural scaling laws from large-N field theory: solvable model beyond the ridgeless limit
Blog Article
Many machine learning models based on neural networks exhibit scaling laws: their performance scales as power laws with respect to the sizes of the model and training data set. We use large-N field theory methods to solve a model recently proposed by Maloney, Roberts and Sully, which provides a simplified setting for studying neural scaling laws. Our solution extends the result of that paper to general nonzero values of the ridge parameter, which are essential to regularize the behavior of the model. In addition to obtaining new and more precise scaling laws, we uncover a duality transformation at the level of diagrams which explains the symmetry between model and training data set sizes. The same duality underlies recent efforts to design neural networks to simulate quantum field theories.
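As a minimal illustration of the power-law scaling described above (a hypothetical sketch, not the paper's model): if a test loss behaves as L(N) ≈ a·N^(−α) in some size parameter N, the exponent α is typically extracted by a linear fit in log-log space. The constants `a_true`, `alpha_true`, and the noise level below are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha_true, a_true = 0.5, 10.0

N = np.logspace(2, 6, 20)                      # sizes spanning several decades
loss = a_true * N ** (-alpha_true)             # ideal power-law loss
loss *= np.exp(rng.normal(0.0, 0.02, N.size))  # small multiplicative noise

# A power law is a straight line in log-log space:
#   log L = log a - alpha * log N
slope, intercept = np.polyfit(np.log(N), np.log(loss), 1)
alpha_fit, a_fit = -slope, np.exp(intercept)

print(f"fitted exponent alpha ~ {alpha_fit:.3f} (true {alpha_true})")
print(f"fitted prefactor a    ~ {a_fit:.2f} (true {a_true})")
```

With only mild multiplicative noise, the recovered exponent lands close to the true value; in practice the interesting question, which the solvable model addresses analytically, is how α depends on the spectra of the data and the network.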