MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

DEEP LEARNING

Question
Batch normalization helps to prevent ____
A
activation functions to become too high or low
B
the training speed to become too slow
C
Both A and B
D
None
Correct answer: C (Both A and B)

Explanation:

Detailed explanation-1: -These parameters are used for re-scaling (gamma) and shifting (beta) the vector containing values from the previous operations. These two are learnable parameters: during training, the neural network finds the optimal values of gamma and beta, which enables accurate normalization of each batch.

Detailed explanation-2: -Its job is to take the outputs from the first hidden layer and normalize them before passing them on as the input to the next hidden layer. Just like the parameters (e.g. weights, biases) of any network layer, a Batch Norm layer also has parameters of its own: two learnable parameters called beta and gamma.

Detailed explanation-3: -Using batch normalization allows us to use much higher learning rates, which further increases the speed at which networks train. It also makes weights easier to initialize: weight initialization can be difficult, and it is even more difficult when creating deeper networks.
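The steps described above can be sketched in a few lines of NumPy: normalize each feature over the mini-batch, then apply the learnable re-scaling (gamma) and shifting (beta) parameters. This is a minimal illustrative sketch of the batch-norm forward pass, not a training-ready implementation; the function and variable names are chosen for clarity.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature across the batch dimension (axis 0)
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable re-scaling (gamma) and shifting (beta), learned during training
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
# One mini-batch of 64 examples with 4 features, deliberately off-center
x = rng.normal(loc=5.0, scale=3.0, size=(64, 4))
# With gamma=1, beta=0 the output is simply the normalized activations
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0))  # per-feature means, close to 0
print(y.std(axis=0))   # per-feature standard deviations, close to 1
```

In a real network, gamma and beta are updated by gradient descent along with the weights, so the layer can recover any scale and shift the next layer needs while still keeping activations from becoming too high or too low.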
