MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

NEURAL NETWORK

Question
The most suitable activation function for hidden layers is:
A
Sigmoid
B
ReLU
C
Softmax
D
Tanh
Explanation: The correct answer is (B) ReLU.

Detailed explanation-1: -The rectified linear activation function, or ReLU activation function, is perhaps the most common function used for hidden layers. It is common because it is both simple to implement and effective at overcoming the limitations of other previously popular activation functions, such as Sigmoid and Tanh.
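
As an illustrative sketch (assuming TensorFlow/Keras is available; the layer sizes and input shape are made up for the example), this is how ReLU is typically placed in the hidden layers while the output layer uses a task-specific function such as Softmax:

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(784,)),             # e.g. flattened 28x28 images
    layers.Dense(128, activation="relu"),   # hidden layer: ReLU
    layers.Dense(64, activation="relu"),    # hidden layer: ReLU
    layers.Dense(10, activation="softmax"), # output layer: Softmax over 10 classes
])
model.summary()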

Detailed explanation-2: -A rectified linear unit (ReLU) is an activation function that introduces non-linearity to a deep learning model and mitigates the vanishing-gradient problem. It outputs the positive part of its argument, f(x) = max(0, x), and is one of the most popular activation functions in deep learning.
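
A minimal NumPy sketch (illustrative helper functions, not a library API) of ReLU and its derivative, showing why the gradients of active units do not vanish:

import numpy as np

def relu(x):
    # Keep the positive part of the input; clamp negatives to zero.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative is 1 for positive inputs and 0 otherwise, so the gradient
    # of an active unit passes through unshrunk, unlike Sigmoid/Tanh.
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]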

Detailed explanation-3: -ReLU (Rectified Linear Unit) is currently the most widely used activation function, appearing in almost all convolutional neural networks and other deep learning models.
