MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

DEEP LEARNING

Question
____ occurs when the gradients become very small and tend towards zero.
A. Exploding Gradients
B. Vanishing Gradients
C. Long Short Term Memory Networks
D. Gated Recurrent Unit Networks
Explanation: The correct answer is (B) Vanishing Gradients.

Detailed explanation-1: As the backpropagation algorithm moves backward from the output layer toward the input layer, the gradients often get smaller and smaller and approach zero, which eventually leaves the weights of the initial (lower) layers nearly unchanged, so those layers effectively stop learning.
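
As an illustration, here is a minimal NumPy sketch of this effect, assuming a toy stack of fully connected sigmoid layers (the layer count, width, and weight scale are illustrative choices, not taken from the question):

    import numpy as np

    rng = np.random.default_rng(0)
    n_layers, width = 20, 64

    # Toy deep network: a stack of fully connected sigmoid layers.
    weights = [rng.normal(0.0, 0.5, (width, width)) for _ in range(n_layers)]

    # Forward pass, caching the activation of every layer.
    x = rng.normal(size=width)
    activations = [x]
    for W in weights:
        x = 1.0 / (1.0 + np.exp(-(W @ x)))            # sigmoid activation
        activations.append(x)

    # Backward pass: start from a unit upstream gradient at the output and
    # push it toward the input, printing the gradient norm at each layer.
    grad = np.ones(width)
    for i in reversed(range(n_layers)):
        a = activations[i + 1]
        grad = weights[i].T @ (grad * a * (1.0 - a))  # chain rule through sigmoid
        print(f"layer {i:2d}: grad norm = {np.linalg.norm(grad):.3e}")

    # The sigmoid derivative a*(1-a) is at most 0.25, so with modest weights the
    # printed norms shrink toward zero as we move back to the earliest layers.

Running this, the gradient norm typically drops by many orders of magnitude between the last layer and the first, which is exactly why the earliest layers receive almost no update signal.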

Detailed explanation-2: What are exploding gradients? Exploding gradients occur when the derivatives (slopes) grow larger and larger as we move backward through each layer during backpropagation. This situation is the exact opposite of vanishing gradients, and it is caused by the magnitude of the weights rather than by the activation function.
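
Continuing the same toy setup, a sketch of the exploding case: with large weights (again an illustrative scale), even plain linear layers make the backpropagated gradient grow from layer to layer:

    import numpy as np

    rng = np.random.default_rng(0)
    n_layers, width = 20, 64

    # Large weights: each backward step multiplies the gradient by W.T, whose
    # typical scale here is well above 1, so the norm grows geometrically.
    weights = [rng.normal(0.0, 1.5, (width, width)) for _ in range(n_layers)]

    grad = np.ones(width)
    for i in reversed(range(n_layers)):
        grad = weights[i].T @ grad                    # linear layer: derivative is W.T
        print(f"layer {i:2d}: grad norm = {np.linalg.norm(grad):.3e}")

    # The blow-up is driven purely by the weight magnitudes, with no activation
    # function involved, matching the point made above.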

Detailed explanation-3: Loss = 0 does not mean the gradient is zero; rather, the gradient is zero when a variable has no influence on the value of the loss function. If you mask your loss values with zeros, the masked entries block any influence of the model variables on the loss, and so their gradient is zero.
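
A small NumPy sketch of that last point, using a hypothetical masked squared-error loss:

    import numpy as np

    rng = np.random.default_rng(0)

    pred = rng.normal(size=5)                    # model outputs
    target = rng.normal(size=5)                  # labels
    mask = np.array([1.0, 1.0, 0.0, 0.0, 1.0])   # zero entries are masked out

    # Masked squared-error loss: masked positions contribute exactly 0.
    loss = np.sum(mask * (pred - target) ** 2)

    # Analytic gradient of the loss with respect to the predictions.
    grad = 2.0 * mask * (pred - target)
    print(loss)   # nonzero overall loss ...
    print(grad)   # ... yet the gradient is exactly zero at the masked positions,
                  # because those predictions have no influence on the loss.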
