MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

DEEP LEARNING

Question
What are the commonly used activation functions?
A
sigmoid
B
tanh
C
relu
D
danish
Explanation: Sigmoid, tanh, and ReLU are all widely used activation functions; "danish" is not an activation function.

Detailed explanation-1: Apart from Leaky ReLU, there are a few other variants of ReLU; the two most popular are the Parameterised ReLU and the Exponential ReLU (ELU).
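
A minimal NumPy sketch of ReLU and the variants mentioned above; the slope and alpha values and the function names are illustrative assumptions, not fixed definitions:

    import numpy as np

    def relu(x):
        # Standard ReLU: zero for negative inputs, identity for positive inputs.
        return np.maximum(0.0, x)

    def leaky_relu(x, slope=0.01):
        # Leaky ReLU: a small fixed slope on the negative side instead of zero.
        return np.where(x > 0, x, slope * x)

    def parametric_relu(x, a):
        # Parameterised ReLU (PReLU): the negative-side slope `a` is learned during training.
        return np.where(x > 0, x, a * x)

    def elu(x, alpha=1.0):
        # Exponential Linear Unit: smooth exponential curve for negative inputs.
        return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

    x = np.array([-5.0, 0.0, 5.0])
    print(relu(x))        # [0. 0. 5.]
    print(leaky_relu(x))  # [-0.05  0.    5.  ]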

Detailed explanation-2: A rectified linear unit (ReLU) is an activation function that introduces non-linearity to a deep learning model and helps avoid the vanishing gradient problem. It returns the positive part of its argument, f(x) = max(0, x), and is one of the most popular activation functions in deep learning.
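
A small sketch, assuming NumPy, that illustrates the vanishing-gradient point: the sigmoid derivative shrinks toward zero as the input grows, while the ReLU derivative stays at 1 for any positive input:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_grad(x):
        s = sigmoid(x)
        return s * (1.0 - s)          # at most 0.25, decays toward 0 for large |x|

    def relu_grad(x):
        return (x > 0).astype(float)  # 1 for positive inputs, 0 otherwise

    x = np.array([1.0, 5.0, 10.0])
    print(sigmoid_grad(x))  # [0.1966 0.0066 0.0000] -- shrinks as x grows
    print(relu_grad(x))     # [1. 1. 1.]             -- stays constant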

Detailed explanation-3: The ReLU function is the default activation function for hidden layers in modern MLP and CNN neural network models. It is not usually used in the hidden layers of RNN models; the sigmoid or tanh function is used there instead. ReLU is generally not used in the output layer, where a task-specific activation (such as softmax, sigmoid, or a linear output) is chosen instead.
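
As a sketch of how those layer-wise defaults typically look in practice, assuming TensorFlow/Keras; the layer sizes, input shapes, and the softmax/sigmoid outputs are illustrative choices, not part of the question:

    import tensorflow as tf

    # Hidden layers of MLP/CNN-style models usually default to ReLU.
    mlp = tf.keras.Sequential([
        tf.keras.Input(shape=(20,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        # The output layer uses a task-specific activation (softmax here), not ReLU.
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    # Hidden recurrent layers typically use tanh (or sigmoid) rather than ReLU.
    rnn = tf.keras.Sequential([
        tf.keras.Input(shape=(None, 8)),
        tf.keras.layers.SimpleRNN(32, activation="tanh"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])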

Detailed explanation-4: Examples of ReLU: when x equals -5, f(-5) is 0 because the input is negative. The output of f(0) is 0 because max(0, 0) = 0. Further, f(5) is 5 because the input is greater than zero.
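
The same worked examples written out as a short Python check; the function name relu is just illustrative:

    def relu(x):
        # f(x) = max(0, x)
        return max(0, x)

    print(relu(-5))  # 0 -- negative input is clipped to zero
    print(relu(0))   # 0 -- max(0, 0) is 0
    print(relu(5))   # 5 -- positive input passes through unchanged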
