Initializing neural networks

https://www.deeplearning.ai/ai-notes/initialization/index.html

From "Initialization" (deeplearning.ai):

"The problem of exploding or vanishing gradients. Consider this 9-layer neural network. At every iteration of the optimization loop (forward pass, cost, backward pass, update), backpropagated gradients are either amplified or attenuated as they move from the output layer toward the input layer."
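
As a rough illustration of the point the excerpt makes (this sketch is not from the article; the network width, depth, and tanh activation are assumptions chosen for the demo), the NumPy snippet below pushes a random input through a 9-layer tanh network and prints the standard deviation of the activations at each layer. With too-small weights the signal shrinks layer by layer, with too-large weights tanh saturates, and with Xavier-style 1/sqrt(fan_in) scaling the variance stays roughly stable; gradient magnitudes on the backward pass scale the same way, since backprop multiplies by the same weight matrices (transposed).

import numpy as np

rng = np.random.default_rng(0)

def forward_stds(weight_std, n_layers=9, width=512):
    """Push a random input through an n_layers-deep tanh MLP and
    record the std of the activations after each layer."""
    x = rng.standard_normal((width, 1))
    stds = []
    for _ in range(n_layers):
        W = rng.standard_normal((width, width)) * weight_std
        x = np.tanh(W @ x)
        stds.append(x.std())
    return stds

# Too-small weights: activations (and hence gradients) shrink layer by layer.
print("std=0.01:", [f"{s:.2e}" for s in forward_stds(0.01)])
# Too-large weights: tanh saturates near +/-1, so its derivative is ~0
# and backpropagated gradients vanish.
print("std=1.0 :", [f"{s:.2e}" for s in forward_stds(1.0)])
# Xavier-style scaling keeps the activation variance roughly constant with depth.
print("xavier  :", [f"{s:.2e}" for s in forward_stds(1 / np.sqrt(512))])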
