MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

MACHINE LEARNING PIPELINE

Question
When updating your weights using the loss function, what dictates how much change the weights should have?
A. Batch size
B. Learning rate
C. Initial weights
D. Bias term
Explanation: The correct answer is B, the learning rate.

Detailed explanation-1: The amount by which the weights are updated during training is referred to as the step size or the "learning rate." Specifically, the learning rate is a configurable hyperparameter used in training neural networks; it takes a small positive value, often in the range between 0.0 and 1.0.

Detailed explanation-2: As an example of a learning rate schedule, the learning rate may gradually increase (warm up) from 0.00002 to 0.002 by epoch 100, then decrease linearly until it reaches 0.0002 at epoch 800.
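A minimal sketch of such a warmup-then-decay schedule, using the epoch boundaries and values from the example above; the function name and the assumption of a linear warmup are illustrative, not taken from the question:

def example_lr_schedule(epoch, warmup_end=100, decay_end=800,
                        start_lr=0.00002, peak_lr=0.002, final_lr=0.0002):
    # Hypothetical schedule matching the example above: the learning rate
    # warms up from start_lr to peak_lr over the first warmup_end epochs,
    # then decays linearly from peak_lr to final_lr by decay_end.
    if epoch <= warmup_end:
        frac = epoch / warmup_end
        return start_lr + frac * (peak_lr - start_lr)
    frac = min((epoch - warmup_end) / (decay_end - warmup_end), 1.0)
    return peak_lr + frac * (final_lr - peak_lr)

# example_lr_schedule(0)   -> 0.00002
# example_lr_schedule(100) -> 0.002
# example_lr_schedule(800) -> 0.0002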

Detailed explanation-3: To update the weights, the gradient of the cost is multiplied by the learning rate (alpha) and subtracted from the current weights. The gradient descent weight update formula is W := W - alpha * dJ/dW, where W = weights, alpha = learning rate, and J = cost.
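A minimal sketch of this update rule, assuming a linear model trained with a mean-squared-error cost in NumPy; the data and variable names are illustrative, not from the question:

import numpy as np

# Illustrative data for a linear model y = X @ W (assumed for this sketch).
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0]])
y = np.array([5.0, 4.0, 11.0])

W = np.zeros(2)   # initial weights
alpha = 0.01      # learning rate: controls how much the weights change per step

for step in range(1000):
    preds = X @ W
    # Gradient of the mean-squared-error cost J with respect to W.
    grad = (2.0 / len(y)) * X.T @ (preds - y)
    # Gradient descent update: W = W - alpha * dJ/dW.
    W = W - alpha * grad

print(W)  # approaches [1.0, 2.0], the exact least-squares solution here

The learning rate alpha directly scales the size of each weight change: a larger value makes bigger steps (and can cause divergence), while a smaller value converges more slowly but more steadily.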
