MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

SUPERVISED AND UNSUPERVISED LEARNING

Question
Which of the following statements about gradient descent is true?
A
We find local maxima in gradient descent
B
It is an optimization algorithm
C
Either A or B
D
None of the above
Explanation: The correct answer is B. Gradient descent is an optimization algorithm; it steps in the direction of steepest descent to find a minimum, not a maximum, so A does not hold.

Detailed explanation-1: -Gradient descent does not guarantee the globally best answer in a single run; in general it converges to a local minimum. It is an optimization algorithm that seeks the best parameters of a function. Gradient descent works best for a convex objective, where any local minimum is also the global minimum, and it requires the objective function to be differentiable.
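
To see why convexity matters, the minimal Python sketch below runs gradient descent on a non-convex function from two different starting points; the function f(x) = x**4 - 3*x**2 + x, the learning rate, the iteration count, and the starting points are all assumptions chosen for illustration, not part of the quiz. Started at x = 2.0 it settles in a local minimum, while started at x = -2.0 it reaches the deeper one, which is exactly why a single run only guarantees the best answer on a convex objective.

def f(x):
    # Assumed non-convex example function with two minima.
    return x ** 4 - 3 * x ** 2 + x

def grad_f(x):
    # First derivative of f(x); gradient descent needs f to be differentiable.
    return 4 * x ** 3 - 6 * x + 2

def descend(x, learning_rate=0.01, steps=500):
    # Repeatedly step against the gradient from the given starting point.
    for _ in range(steps):
        x -= learning_rate * grad_f(x)
    return x

print(descend(2.0))    # stalls at the local minimum near x = 1.0
print(descend(-2.0))   # finds the deeper (global) minimum near x = -1.37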

Detailed explanation-2: -Gradient Descent is the most common optimization algorithm in machine learning and deep learning. It is a first-order optimization algorithm, meaning it takes into account only the first derivative when performing updates on the parameters.
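
As a concrete sketch of such a first-order update, the Python below repeats the rule x = x - learning_rate * grad_f(x) on a simple convex function; the function f(x) = (x - 3)**2, the starting point, and the learning rate are illustrative assumptions.

def f(x):
    # Assumed convex example function with its minimum at x = 3.
    return (x - 3) ** 2

def grad_f(x):
    # First derivative of f; the only information the update uses.
    return 2 * (x - 3)

x = 0.0                # assumed starting value
learning_rate = 0.1    # assumed step size

for step in range(50):
    x = x - learning_rate * grad_f(x)   # first-order update rule

print(x)   # converges toward the minimizer x = 3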

Detailed explanation-3: -Gradient descent is an optimization algorithm that is commonly used to train machine learning models and neural networks. Training data helps these models learn over time, and the cost function within gradient descent acts as a barometer, gauging the model's accuracy with each iteration of parameter updates.

Detailed explanation-4: -Gradient descent (GD) is an iterative first-order optimization algorithm used to find a local minimum/maximum of a given function. This method is commonly used in machine learning (ML) and deep learning (DL) to minimize a cost/loss function (e.g., in linear regression).
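
The sketch below ties explanations 3 and 4 together: gradient descent fitting a simple linear regression by minimizing a mean-squared-error cost, recomputing the cost each iteration as the "barometer" of accuracy. The toy data, learning rate, and iteration count are assumptions made purely for illustration.

import numpy as np

# Toy data scattered around y = 2x + 1 (assumed for illustration).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)
y = 2.0 * X + 1.0 + rng.normal(0.0, 0.5, size=100)

w, b = 0.0, 0.0         # parameters to learn
learning_rate = 0.01    # assumed step size
n = len(X)

for step in range(1000):
    error = (w * X + b) - y
    cost = (error ** 2).mean()          # MSE cost: the "barometer" of accuracy
    dw = 2.0 / n * (error * X).sum()    # dCost/dw (first derivative only)
    db = 2.0 / n * error.sum()          # dCost/db
    w -= learning_rate * dw             # step against the gradient
    b -= learning_rate * db
    if step % 200 == 0:
        print(f"step {step}: cost {cost:.4f}")

print(w, b)   # approaches the true slope 2 and intercept 1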
