APPLICATION OF SUPERVISED LEARNING
NEURAL NETWORK
Question
Each layer in a neural network is connected to the next layer through:

bias
neuron
weight
activation function

Correct answer: weight
Explanation:
Detailed explanation-1: A fully connected layer multiplies its input by a weight matrix and then adds a bias vector. In a convolutional network, the convolutional (and down-sampling) layers are followed by one or more fully connected layers. As the name suggests, every neuron in a fully connected layer connects to all of the neurons in the previous layer.
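To make the "weight matrix times input plus bias" description concrete, here is a minimal NumPy sketch of a single fully connected layer. The layer sizes and variable names are illustrative assumptions, not values taken from the question.

import numpy as np

# Minimal sketch of a fully connected layer: the outputs are connected to
# the inputs through a weight matrix W, plus a bias vector b (y = W x + b).
# The sizes below are illustrative assumptions, not from the question.
rng = np.random.default_rng(0)

n_in, n_out = 4, 3                       # hypothetical layer sizes
W = rng.standard_normal((n_out, n_in))   # one weight per input/output pair
b = rng.standard_normal(n_out)           # one bias per output neuron

x = rng.standard_normal(n_in)            # example input vector
y = W @ x + b                            # weighted sum plus bias
print(y.shape)                           # (3,)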
Detailed explanation-2: For the fully connected architecture, I used a total of three hidden layers with the 'relu' activation function, apart from the input and output layers. The total number of trainable parameters is around 0.3 million.
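The following tf.keras sketch shows what such a network might look like. The layer widths (784, 300, 150, 75, 10) are assumptions chosen only so that the trainable-parameter count lands near 0.3 million; the original sizes are not given in the explanation.

import tensorflow as tf

# Hedged sketch of a fully connected network with three 'relu' hidden
# layers between an input and an output layer. The widths are assumptions,
# not the original author's values.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(300, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(150, activation="relu"),
    tf.keras.layers.Dense(75, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.summary()  # roughly 0.29 million trainable parameters with these sizes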