MCQ IN COMPUTER SCIENCE & ENGINEERING


MACHINE LEARNING

Question
Which of the following are false?
A
If a linear separable decision boundary exists for a classification problem, the perceptron model is capable of finding it.
B
One perceptron can be trained with zero training error for an XOR function.
C
The back-propagation algorithm updates the parameters using gradient descent rule.
D
While training a neural network for a binary classification task, an ideal choice for the initialization of parameters is large random numbers, so that the gradients are larger.
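Option B can be checked empirically. The sketch below (a minimal NumPy implementation of the classic perceptron learning rule; the helper name `train_perceptron` is illustrative) reaches zero training error on the linearly separable AND function, but never on XOR, which is not linearly separable:

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    """Classic perceptron rule: w += lr * (y - y_hat) * x on each mistake."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            y_hat = 1 if xi @ w > 0 else 0
            w += lr * (yi - y_hat) * xi
    preds = (Xb @ w > 0).astype(int)
    return int((preds != y).sum())  # number of training errors

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
and_errors = train_perceptron(X, np.array([0, 0, 0, 1]))  # linearly separable
xor_errors = train_perceptron(X, np.array([0, 1, 1, 0]))  # not separable

print(and_errors)  # 0: the perceptron finds a separating boundary for AND
print(xor_errors)  # > 0: zero training error is impossible for XOR
```

No matter how many epochs are run, a single linear threshold unit must misclassify at least one of the four XOR points, so statement B is false.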
Explanation: 

Detailed explanation-1: -The training time depends on the size of the network: the greater the number of neurons, the greater the number of possible 'states'. Neural networks can be simulated on a conventional computer, but the main advantage of neural networks, parallel execution, is then lost.

Detailed explanation-2: -Neural networks have higher computational rates than conventional computers because many of the operations are done in parallel. That is not the case when the neural network is simulated on a computer. The idea behind neural nets is based on the way the human brain works.

Detailed explanation-3: -In a binary classifier, we use the sigmoid activation function with one node. In a multiclass classification problem, we use the softmax activation function with one node per class. In a multilabel classification problem, we use the sigmoid activation function with one node per class.
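The activation/output-node pairing in explanation 3 can be sketched in plain NumPy (the helper names `sigmoid` and `softmax` are illustrative, not from any particular framework):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

# Binary classification: a single logit through one sigmoid node.
logit = 1.2
p_pos = sigmoid(logit)               # P(class = 1); P(class = 0) = 1 - p_pos

# Multiclass (mutually exclusive classes): one logit per class through softmax.
logits = np.array([2.0, 0.5, -1.0])
probs = softmax(logits)              # sums to (numerically) 1

# Multilabel (independent labels): one sigmoid per label; need not sum to 1.
label_logits = np.array([2.0, -0.5, 1.5])
label_probs = sigmoid(label_logits)

print(p_pos, probs, label_probs)
```

The key distinction is that softmax couples its outputs into a single distribution over classes, while per-label sigmoids treat each label as an independent yes/no decision.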

Detailed explanation-4: -Initializing all the weights with zeros leads the neurons to learn the same features during training. In fact, any constant initialization scheme will perform very poorly. Consider a neural network with two hidden units, and assume we initialize all the biases to 0 and the weights with some constant.
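The symmetry problem described above can be demonstrated with a toy 2-2-1 network trained by plain gradient descent (a hypothetical sketch; the architecture, data, and learning rate are arbitrary choices). Because both hidden units start with identical weights, they compute identical activations and receive identical gradients, so they remain identical forever:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

c = 0.5
W1 = np.full((2, 2), c)   # constant init: both hidden units start identical
b1 = np.zeros((1, 2))
W2 = np.full((2, 1), c)
b2 = np.zeros((1, 1))

for _ in range(100):      # plain gradient descent on squared error
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.1 * h.T @ d_out
    b2 -= 0.1 * d_out.sum(axis=0, keepdims=True)
    W1 -= 0.1 * X.T @ d_h
    b1 -= 0.1 * d_h.sum(axis=0, keepdims=True)

# The two hidden units' weight columns are still identical after training:
# the symmetry is never broken, so the network behaves like one hidden unit.
print(np.allclose(W1[:, 0], W1[:, 1]))
```

This is why small random initialization is used: it breaks the symmetry without producing the large activations (and saturated sigmoids with near-zero gradients) that large random weights would cause, which is also why statement D is false.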
