APPLICATION OF SUPERVISED LEARNING
DEEP LEARNING
Question
What is CORRECT about gradient descent?
(A) It returns the optimum parameters that minimize the cost function
(B) It returns the optimum parameters that maximize the cost function
(C) It returns the optimum parameters that minimize the loss function
(D) It returns the optimum parameters that maximize the loss function

Correct answer: (A) It returns the optimum parameters that minimize the cost function. (The terms cost and loss are often used interchangeably; the key point, confirmed by the explanations below, is that gradient descent minimizes, never maximizes, the objective.)
Explanation:
Detailed explanation-1: Gradient descent is the most common optimization algorithm in machine learning and deep learning. It is a first-order optimization algorithm, meaning it uses only the first derivative (the gradient) of the cost function when updating the parameters.
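As a sketch of what "first-order" means here (this formula is standard but not quoted from the quiz), the update rule for parameters theta with learning rate alpha and cost function J can be written in LaTeX as:

    \theta_{t+1} = \theta_t - \alpha \, \nabla_{\theta} J(\theta_t)

Only the gradient (first derivative) of J appears in the update; no second-derivative information is used.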
Detailed explanation-2: With a suitably small learning rate, each step of gradient descent decreases (or at worst leaves unchanged) the value of the function; if the learning rate is too large, a step can overshoot the minimum and actually increase it.
Detailed explanation-3: The gradient always points in the direction of steepest increase in the loss function. The gradient descent algorithm takes a step in the direction of the negative gradient in order to reduce the loss as quickly as possible.
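To make these explanations concrete, here is a minimal gradient descent sketch in Python (not part of the original quiz). It minimizes an assumed example cost function J(theta) = (theta - 3)^2, whose gradient is 2 * (theta - 3); the function names, learning rate, and step count are illustrative choices, not from the source.

```python
# Minimal gradient descent sketch: minimize J(theta) = (theta - 3)^2.
# The minimum is at theta = 3, so the loop should converge toward 3.

def cost(theta: float) -> float:
    """Example cost function J(theta); assumed for illustration."""
    return (theta - 3.0) ** 2

def gradient(theta: float) -> float:
    """First derivative of the cost: dJ/dtheta = 2 * (theta - 3)."""
    return 2.0 * (theta - 3.0)

def gradient_descent(theta: float, learning_rate: float = 0.1,
                     steps: int = 100) -> float:
    """Repeatedly step in the direction of the negative gradient."""
    for _ in range(steps):
        theta = theta - learning_rate * gradient(theta)
    return theta

if __name__ == "__main__":
    theta_opt = gradient_descent(theta=0.0)
    print(f"theta ~ {theta_opt:.4f}, cost ~ {cost(theta_opt):.6f}")
    # Expected: theta close to 3, cost close to 0.
```

Because only the first derivative is used, this matches the "first-order" description in explanation-1, and with this small learning rate every step strictly decreases the cost, matching explanation-2.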