The Slingshot Effect: A Late-Stage Optimization Anomaly in Adam-Family of Optimization Methods
Machine Learning Research at Apple
APRIL 21, 2024
Adaptive gradient methods, notably Adam, have become indispensable for optimizing neural networks, particularly in conjunction with Transformers. In this paper, we present a novel optimization anomaly called the Slingshot Effect, which manifests during extremely late stages of training. We identify a distinctive characteristic of this phenomenon through cyclic phase transitions between stable and unstable training regimes, as evidenced by the cyclic behavior of the norm of the last layer's weights.
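To make the diagnostic concrete, below is a minimal sketch (PyTorch, illustrative only) of how one might track the last-layer weight norm over a long Adam training run to look for the cyclic behavior described above. The model, data, and hyperparameters are placeholders, not the paper's experimental setup.

import torch
import torch.nn as nn

# Placeholder model; the monitored quantity is the classifier head's weight norm.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

last_layer = model[-1]      # the last layer whose weight norm we monitor
weight_norms = []

for step in range(100_000): # "extremely late stages" implies many steps
    x = torch.randn(64, 128)               # stand-in for a real batch
    y = torch.randint(0, 10, (64,))

    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

    # Record the norm of the last layer's weights; under the Slingshot
    # Effect this trace would rise and fall cyclically late in training.
    weight_norms.append(last_layer.weight.norm().item())

Plotting weight_norms against step would then reveal whether training alternates between stable and unstable regimes.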