MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

DEEP LEARNING

Question
____ is used to find local minima of the cost function
A. stochastic gradient descent
B. gradient descent
C. linear regression
D. logistic regression
Explanation: The correct answer is (B) gradient descent, as the detailed explanations below describe.

Detailed explanation-1: -Gradient descent is an efficient optimization algorithm that attempts to find a local or global minimum of the cost function. A local minimum is a point where the function value is lower than at all neighboring points, so the cost function cannot be decreased further by taking infinitesimal steps from that point.

Detailed explanation-2: -Gradient descent is used to minimize a cost function J(W) parameterized by the model's parameters W. The gradient (or derivative) tells us the incline, or slope, of the cost function. Hence, to minimize the cost function, we move in the direction opposite to the gradient.
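
As a compact sketch of that rule (the step-size symbol \alpha is conventional notation and is not named in the explanation above), one gradient descent update can be written as

    W_{t+1} = W_t - \alpha \, \nabla J(W_t)

where \nabla J(W_t) is the gradient of the cost at the current parameters and \alpha > 0 is the learning rate controlling how large a step is taken.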

Detailed explanation-3: -Gradient Descent is an iterative process that finds a minimum of a function. It is an optimization algorithm that finds the parameters or coefficients of a function at which the function has a minimum value. However, it does not guarantee finding the global minimum and can get stuck at a local minimum.
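
To make the iteration concrete, here is a minimal sketch of gradient descent in Python on a one-dimensional cost function. The cost J(w) = (w - 3)^2, the learning rate of 0.1, and the stopping tolerance are illustrative assumptions, not values taken from the question.

    def cost(w):
        # Illustrative cost function J(w) = (w - 3)^2, minimized at w = 3
        return (w - 3.0) ** 2

    def gradient(w):
        # Derivative dJ/dw = 2 * (w - 3)
        return 2.0 * (w - 3.0)

    w = 0.0                 # initial parameter value (assumed starting point)
    learning_rate = 0.1     # step size, the alpha in the update rule

    for step in range(1000):
        grad = gradient(w)
        if abs(grad) < 1e-8:        # stop once the slope is (almost) zero
            break
        w -= learning_rate * grad   # move opposite to the gradient

    print(f"minimum near w = {w:.4f}, cost = {cost(w):.8f}")

Because each update simply follows the local slope, the same loop started from a different initial w could settle at a different (local) minimum if J had more than one valley, which is the caveat noted in explanation 3.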
