APPLICATION OF SUPERVISED LEARNING
NEURAL NETWORK
Question
Which of the following is not an activation function of a neural network?

Sigmoidal Function
Hyperbolic Tangent Function (Tanh)
Triangular Function (correct answer)
Rectified Linear Unit (ReLU) Function
Explanation:
What is a neural network activation function? An activation function decides whether a neuron should be activated or not. In other words, it determines whether that neuron's input is important to the network's prediction, using simple mathematical operations.

There are perhaps three activation functions you may want to consider for use in hidden layers: Rectified Linear Unit (ReLU), Logistic (Sigmoid), and Hyperbolic Tangent (Tanh).

Other common activation functions include the Binary Step function, the Linear function, Leaky ReLU, Parameterised ReLU, and the Exponential Linear Unit (ELU). The triangular function is not among them, which makes it the correct answer here.
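The three hidden-layer activations named above can be sketched in a few lines of Python; this is a minimal illustration (function names are our own, not from any particular library):

```python
import math

def sigmoid(x):
    # Logistic function: squashes any real input into the interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes any real input into the interval (-1, 1)
    return math.tanh(x)

def relu(x):
    # Rectified Linear Unit: passes positive inputs through, zeros out negatives
    return max(0.0, x)

# Each function maps a neuron's pre-activation value to its output.
print(sigmoid(0.0))  # 0.5
print(tanh(0.0))     # 0.0
print(relu(-2.0))    # 0.0
print(relu(3.0))     # 3.0
```

Note how each function is a simple, cheap operation applied element-wise to a neuron's weighted input, which is exactly why they are practical inside large networks.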