APPLICATION OF SUPERVISED LEARNING
DEEP LEARNING
Question
Pooling layers do not have parameters that affect backpropagation.
(A) True
(B) False
(C) Either A or B
(D) None of the above

Correct answer: (A) True
Detailed explanation-1: Pooling layers do not have parameters that affect backpropagation. Backpropagation is an algorithm that efficiently implements gradient descent in a neural network by using the chain rule to propagate the error from the final layers back to the early ones.
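As a quick check, here is a minimal PyTorch sketch (the layer sizes are arbitrary examples): a pooling layer reports zero trainable parameters, while a convolutional layer does not.

```python
import torch.nn as nn

conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3)
pool = nn.MaxPool2d(kernel_size=2)

# Conv layer: 8 * (3*3*3) weights + 8 biases = 224 trainable parameters
print(sum(p.numel() for p in conv.parameters()))  # 224
# Max-pooling layer: nothing to learn, so nothing for gradient descent to update
print(sum(p.numel() for p in pool.parameters()))  # 0
```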
Detailed explanation-2: Backward propagation: for the backward pass in a max-pooling layer, we start with a zero matrix and fill the position of each window's maximum with the gradient from above. If we instead treat it as an average-pooling layer, we fill every cell of each window with the gradient from above divided by the window size, since every input contributed equally to the average.
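The following is a minimal NumPy sketch of that gradient routing, assuming a 2-D input with non-overlapping k-by-k windows; the function names are illustrative, not from any particular library.

```python
import numpy as np

def maxpool_backward(x, grad_out, k=2):
    """Route each upstream gradient to the argmax of its k-by-k window."""
    grad_in = np.zeros_like(x)                    # start with a zero matrix
    for i in range(grad_out.shape[0]):
        for j in range(grad_out.shape[1]):
            window = x[i*k:(i+1)*k, j*k:(j+1)*k]
            r, c = np.unravel_index(np.argmax(window), window.shape)
            grad_in[i*k + r, j*k + c] = grad_out[i, j]   # fill the max index
    return grad_in

def avgpool_backward(x, grad_out, k=2):
    """Spread each upstream gradient equally over its k-by-k window."""
    grad_in = np.zeros_like(x)
    for i in range(grad_out.shape[0]):
        for j in range(grad_out.shape[1]):
            grad_in[i*k:(i+1)*k, j*k:(j+1)*k] = grad_out[i, j] / (k * k)
    return grad_in

x = np.array([[1., 3., 2., 1.],
              [4., 2., 0., 1.],
              [1., 1., 3., 2.],
              [0., 2., 1., 4.]])
g = np.ones((2, 2))                # upstream gradient, one value per window
print(maxpool_backward(x, g))      # nonzero only at each window's maximum
print(avgpool_backward(x, g))      # 0.25 in every cell (1 / (2*2))
```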
Detailed explanation-3: There are no trainable parameters in a max-pooling layer. In the forward pass, it passes the maximum value within each window to the next layer. In the backward pass, it propagates the error from the next layer back to the position where the maximum was taken, because that is where the error comes from.
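This behaviour can be confirmed with autograd; in this small PyTorch sketch (the 2-by-2 input is an arbitrary example), the gradient lands only at the position of the maximum.

```python
import torch

x = torch.tensor([[1., 3.],
                  [4., 2.]], requires_grad=True)
y = x.max()        # forward: only the maximum value (4.0) is passed on
y.backward()       # backward: the error returns to where the max was taken
print(x.grad)      # tensor([[0., 0.], [1., 0.]])
```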