MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

DEEP LEARNING

Question
Which of the following statements about the Momentum optimizer are correct?
A
Enhances the stability of the gradient update direction and reduces abrupt changes
B
Requires fewer experiments to determine an appropriate learning rate
C
Converges too slowly
D
A small ball with inertia is more likely to roll over narrow local minima
Explanation: 

Detailed explanation-1: -Momentum is an extension of the gradient descent optimization algorithm. It builds inertia in a direction in the search space, which helps the search overcome the oscillations of noisy gradients and coast across flat regions of the search space.
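For illustration, here is a minimal NumPy sketch of the momentum update described above. The function name sgd_momentum, the learning rate lr, the momentum coefficient beta, and the toy gradient are assumptions chosen for this example, not part of the question:

```python
import numpy as np

def sgd_momentum(grad, w0, lr=0.01, beta=0.9, steps=100):
    """Momentum update sketch: v accumulates an exponentially decaying
    average of past gradients, so the step direction is smoothed and
    can carry the search across flat or noisy regions."""
    w = np.asarray(w0, dtype=float)
    v = np.zeros_like(w)
    for _ in range(steps):
        g = grad(w)            # gradient at the current parameters
        v = beta * v + lr * g  # inertia: blend old velocity with new gradient
        w = w - v              # step against the accumulated velocity
    return w

# Toy example: minimize f(w) = w^2, whose gradient is 2w; w moves toward 0.
print(sgd_momentum(lambda w: 2 * w, w0=[5.0]))
```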

Detailed explanation-3: -The Adam optimizer combines the momentum concept from "SGD with momentum" with the adaptive per-parameter learning rate idea from methods such as AdaDelta. With these two running estimates, the weight and bias updates are both smoothed and scaled per parameter. One advantage of the Adam optimizer is that it is straightforward to implement.
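For illustration, a minimal NumPy sketch of the Adam update follows. The function name adam, the default hyperparameters (lr, beta1, beta2, eps), and the toy gradient are assumptions for this example, not taken from the question:

```python
import numpy as np

def adam(grad, w0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=200):
    """Adam update sketch: m is the momentum-style first-moment estimate,
    v is the second-moment estimate used to scale the step per parameter;
    both are bias-corrected before the update."""
    w = np.asarray(w0, dtype=float)
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g       # first moment (momentum)
        v = beta2 * v + (1 - beta2) * g * g   # second moment (adaptive scale)
        m_hat = m / (1 - beta1 ** t)          # bias correction
        v_hat = v / (1 - beta2 ** t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

# Toy example: minimize f(w) = w^2, whose gradient is 2w; w approaches 0.
print(adam(lambda w: 2 * w, w0=[5.0]))
```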
