MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

ARTIFICIAL INTELLIGENCE

Question
What are the commonly used activation functions? (Multiple Choice)
A. sigmoid
B. tanh
C. danish
D. relu
Explanation: The correct choices are A (sigmoid), B (tanh), and D (relu); "danish" is not an activation function.

Detailed explanation-1: -There are perhaps three activation functions you may want to consider for use in hidden layers; they are: Rectified Linear Activation (ReLU), Logistic (Sigmoid), and Hyperbolic Tangent (Tanh).
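For reference, a minimal NumPy sketch of the three functions named above; the function names and sample inputs here are illustrative, not part of the original quiz:

import numpy as np

def sigmoid(x):
    # Logistic function: squashes inputs into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes inputs into the range (-1, 1)
    return np.tanh(x)

def relu(x):
    # Rectified linear unit: keeps positive inputs, zeroes out negatives
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # approx [0.119 0.5   0.881]
print(tanh(x))     # approx [-0.964 0.    0.964]
print(relu(x))     # [0. 0. 2.]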

Detailed explanation-2: -The main reason ReLU is used is that it is simple, fast, and empirically it seems to work well. Early papers observed that training a deep network with ReLU tended to converge much more quickly and reliably than training a deep network with sigmoid activation.

Detailed explanation-3: -A rectified linear unit (ReLU) is an activation function that introduces non-linearity to a deep learning model and helps mitigate the vanishing gradient problem. It outputs the positive part of its argument and is one of the most popular activation functions in deep learning.
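To illustrate the vanishing-gradient point, a small sketch comparing the sigmoid derivative (which shrinks toward zero for large-magnitude inputs) with the ReLU (sub)gradient (which stays at 1 for any positive input); the sample values are assumptions chosen for demonstration:

import numpy as np

def sigmoid_grad(x):
    # Derivative of the logistic sigmoid: s(x) * (1 - s(x)); never exceeds 0.25
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

def relu_grad(x):
    # (Sub)gradient of ReLU: 1 for positive inputs, 0 otherwise
    return (x > 0).astype(float)

x = np.array([-10.0, -2.0, 2.0, 10.0])
print(sigmoid_grad(x))  # approx [0.000045 0.105 0.105 0.000045]
print(relu_grad(x))     # [0. 0. 1. 1.]

Because the sigmoid gradient is nearly zero for large-magnitude inputs, stacked layers multiply many small factors during backpropagation, which is why deep sigmoid networks can train slowly compared with ReLU networks.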
