APPLICATION OF SUPERVISED LEARNING
DEEP LEARNING
Question
Which of the following is not an activation function?

ReLU
Tanh
Sigmoid
SGD

Answer: SGD
Explanation:
Detailed explanation-1: -SGD (Stochastic Gradient Descent) is an optimization algorithm used to update a network's weights during training, not an activation function. Hence, SGD is not an activation function.
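To make the distinction concrete, here is a minimal sketch (using NumPy; function names are illustrative) of a single SGD update step. Unlike an activation function, SGD does not transform a neuron's output; it adjusts the model's parameters using gradients.

```python
import numpy as np

def sgd_step(weights, grads, lr=0.01):
    # One SGD update: move the weights a small step against the
    # gradient. This acts on parameters during training, not on
    # neuron outputs like ReLU, Tanh, or Sigmoid do.
    return weights - lr * grads

w = np.array([0.5, -0.3])   # current weights (toy example)
g = np.array([0.1, -0.2])   # gradients of the loss w.r.t. w
w_new = sgd_step(w, g, lr=0.1)
```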
Detailed explanation-2: -The sigmoid function is commonly used as an activation function in artificial neural networks. In feedforward networks, the sigmoid function is applied to each neuron's output, allowing the network to introduce non-linearity into the model.
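The elementwise application described above can be sketched as follows (a minimal NumPy example, not tied to any particular framework):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the interval (0, 1),
    # introducing non-linearity into the network.
    return 1.0 / (1.0 + np.exp(-x))

# Applied elementwise to a layer's pre-activations,
# as in a feedforward network.
z = np.array([-2.0, 0.0, 2.0])
a = sigmoid(z)
```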
Detailed explanation-3: -Common activation functions include:
Binary Step Function
Linear Activation Function
Sigmoid/Logistic Activation Function (and its derivative)
Tanh Function (Hyperbolic Tangent) (and its gradient)
ReLU Activation Function (and the Dying ReLU problem)
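The functions listed above can be written compactly (a NumPy sketch for comparison; none of the answer choices involves SGD, which has no such elementwise form):

```python
import numpy as np

def binary_step(x):
    return np.where(x >= 0, 1.0, 0.0)

def linear(x):
    return x

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    # Zeroes all negative inputs; a neuron whose inputs stay
    # negative gets zero gradient there ("dying ReLU").
    return np.maximum(0.0, x)

x = np.array([-1.0, 0.0, 1.0])
activations = {f.__name__: f(x) for f in (binary_step, linear, sigmoid, tanh, relu)}
```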