MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

NEURAL NETWORK

Question
What does a ReLU activation function do?
A
Returns 0 or 1 depending on whether the input is positive or negative
B
Returns e to the power of the input
C
Returns 0 if the input is negative, and returns the input unchanged if the input is positive
D
Returns a number between 0 and 1, based on the absolute value of the input
Explanation: 

Detailed explanation-1: -When most ReLU neurons output zero (the "dying ReLU" problem), gradients fail to flow during backpropagation and the corresponding weights are not updated. Ultimately a large part of the network becomes inactive and is unable to learn further.

Detailed explanation-2: -The ReLU function and its derivative are both monotonic. The function returns 0 if it receives any negative input, but for any positive value x it returns that value back. Thus its output ranges from 0 to infinity.

Detailed explanation-3: -The rectified linear activation function or ReLU for short is a piecewise linear function that will output the input directly if it is positive, otherwise, it will output zero.
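The behaviour described above can be sketched in a few lines of Python (a minimal illustration, not part of the original question):

```python
def relu(x):
    # Rectified linear unit: outputs the input directly if it is
    # positive, otherwise outputs zero.
    return x if x > 0 else 0.0

def relu_derivative(x):
    # The derivative is 1 for positive inputs and 0 for negative
    # inputs, which is why gradients stop flowing through "dead"
    # neurons during backpropagation.
    return 1.0 if x > 0 else 0.0
```

For example, `relu(-3.0)` gives `0.0` while `relu(2.5)` gives `2.5`, matching answer choice C.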
