MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

DEEP LEARNING

Question
If we want our neural network to converge faster, which of the following should be used?
(A) Batch gradient descent
(B) Stochastic gradient descent
(C) Mini-batch gradient descent
(D) All of the above
Correct answer: (B) Stochastic gradient descent

Explanation:

Detailed explanation-1: Stochastic gradient descent (SGD, or "online" gradient descent) typically reaches convergence much faster than batch (or "standard") gradient descent, since it updates the weights much more frequently.

Detailed explanation-2: SGD is easy to fit in memory, since only one data point needs to be processed at a time. It updates the weights more often than batch gradient descent, and hence converges faster, and each update is computationally cheaper than a full-batch update.
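A minimal sketch of the difference, assuming NumPy and a toy one-parameter linear regression (neither appears in the original quiz): batch gradient descent makes one weight update per pass over the data, while SGD makes one update per sample, so its weights move far more often per epoch.

```python
# Sketch only: batch GD vs. SGD on a toy linear-regression problem (true weight = 3.0).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))                          # 200 samples, 1 feature
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=200)    # noisy targets

def batch_gd(X, y, lr=0.1, epochs=20):
    """One weight update per epoch, using the gradient averaged over ALL samples."""
    w = 0.0
    for _ in range(epochs):
        grad = np.mean((X[:, 0] * w - y) * X[:, 0])    # full-batch gradient
        w -= lr * grad
    return w

def sgd(X, y, lr=0.1, epochs=20):
    """One weight update PER SAMPLE, so the weights change 200x more often per epoch."""
    w = 0.0
    n = len(y)
    for _ in range(epochs):
        for i in rng.permutation(n):
            grad = (X[i, 0] * w - y[i]) * X[i, 0]      # single-sample gradient
            w -= lr * grad
    return w

print("batch GD estimate:", batch_gd(X, y))   # still approaching 3.0 after 20 epochs
print("SGD estimate:     ", sgd(X, y))        # typically much closer to 3.0 already
```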
