MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

DEEP LEARNING

Question
A = 1/(1 + e^(-x)) is an equation representing which activation function?
A. ReLU
B. Sigmoid
C. Leaky ReLU
D. Tanh
Explanation: The correct answer is B, Sigmoid.

Detailed explanation-1: A = 1/(1 + e^(-x)) defines the sigmoid function, which is non-linear in nature.

Detailed explanation-2: The sigmoid (logistic) activation function takes any real value as input and outputs values in the range of 0 to 1. The larger the input (more positive), the closer the output is to 1.0; the smaller the input (more negative), the closer the output is to 0.0.
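
For illustration only, here is a minimal NumPy sketch of the sigmoid function described above; the name sigmoid and the sample inputs are assumptions for this example, not part of the original question.

    import numpy as np

    def sigmoid(x):
        # A = 1 / (1 + e^(-x)): maps any real input into the open interval (0, 1)
        return 1.0 / (1.0 + np.exp(-x))

    # Large positive inputs approach 1.0, large negative inputs approach 0.0
    print(sigmoid(np.array([-10.0, 0.0, 10.0])))  # approx. [0.0000454, 0.5, 0.9999546]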

Detailed explanation-3: The ReLU formula is f(x) = max(0, x), so its output ranges from 0 to infinity. ReLU is the most commonly used activation function in neural networks, especially CNNs, and often serves as the default activation function.
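
For comparison, a minimal NumPy sketch of ReLU under the same assumptions (the name relu and the sample inputs are illustrative only).

    import numpy as np

    def relu(x):
        # f(x) = max(0, x): negative inputs are clamped to 0, positive inputs pass through
        return np.maximum(0.0, x)

    print(relu(np.array([-2.0, 0.0, 3.5])))  # [0.0, 0.0, 3.5]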
