How to Avoid Exploding Gradients With Gradient Clipping - MachineLearningMastery.com

Training a neural network can become unstable given the choice of error function, learning rate, or even the scale of the target variable. Large updates to weights during training can cause a numerical overflow or underflow, a problem often referred to as "exploding gradients." Exploding gradients are more common in recurrent neural networks, such as Long Short-Term Memory (LSTM) networks, because gradients accumulate over many time steps when processing long input sequences.
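One standard remedy is gradient clipping: before the weight update, the gradient vector is rescaled whenever its L2 norm exceeds a chosen threshold, which bounds the size of any single update. The snippet below is a minimal NumPy sketch of clipping by norm; the function name `clip_by_norm` and the threshold of 1.0 are illustrative choices, not part of any particular library's API.

```python
import numpy as np

def clip_by_norm(gradients, max_norm=1.0):
    """Rescale the gradient vector if its L2 norm exceeds max_norm.

    The direction of the gradient is preserved; only its magnitude
    is capped, so the update still points downhill.
    """
    norm = np.linalg.norm(gradients)
    if norm > max_norm:
        gradients = gradients * (max_norm / norm)
    return gradients

# A gradient with L2 norm 5.0 gets rescaled to norm 1.0.
grads = np.array([3.0, 4.0])
clipped = clip_by_norm(grads, max_norm=1.0)
print(np.linalg.norm(clipped))  # 1.0

# A gradient already within the threshold is left unchanged.
small = clip_by_norm(np.array([0.3, 0.4]), max_norm=1.0)
print(small)
```

In practice you rarely write this by hand: Keras optimizers accept `clipnorm` and `clipvalue` arguments (e.g. `SGD(learning_rate=0.01, clipnorm=1.0)`) that apply norm-based or value-based clipping automatically at each update.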