MCQ IN COMPUTER SCIENCE & ENGINEERING

COMPUTER SCIENCE AND ENGINEERING

MACHINE LEARNING

Question
A perceptron can correctly classify instances into two classes where the classes are:
A. Overlapping
B. Linearly separable
C. Non-linearly separable
D. None of the above
Explanation: 

Detailed explanation-1: -Can the perceptron convergence theorem be applied when two classes are linearly inseparable? No: the perceptron convergence theorem applies if and only if the two classes are linearly separable.

Detailed explanation-2: -The perceptron learning algorithm is a linear classifier. If the data is separable by a hyperplane, the perceptron will always converge; if the data is not linearly separable, it will never converge.

Detailed explanation-3: -The perceptron is a linear classification algorithm: it learns a decision boundary that separates the two classes with a hyperplane (a line in two-dimensional feature space).

Detailed explanation-4: -Limitation of the perceptron model: it can only classify linearly separable sets of input vectors. If the classes are not linearly separable, the perceptron cannot classify every instance correctly.

Detailed explanation-5: -If a data set is linearly separable, the perceptron will find a separating hyperplane in a finite number of updates. (If the data is not linearly separable, it will loop forever.) The argument goes as follows: suppose there exists w* such that yᵢ(xᵢ⊤w*) > 0 for all (xᵢ, yᵢ) ∈ D.
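The behaviour described above can be seen in a short sketch: the standard perceptron update rule converges in a few passes on a small hand-made linearly separable data set, but never reaches a mistake-free pass on XOR, the classic non-separable case. The data sets and epoch limit here are illustrative, not from the source.

```python
def train_perceptron(data, max_epochs=100):
    """Perceptron update rule: return (weights, bias, converged).

    data is a list of ((x1, x2), y) pairs with y in {+1, -1}.
    converged is True if a full pass produces zero mistakes.
    """
    w, b = [0.0, 0.0], 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for (x1, x2), y in data:
            # Misclassified (or on the boundary): nudge the hyperplane.
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:
                w[0] += y * x1
                w[1] += y * x2
                b += y
                mistakes += 1
        if mistakes == 0:  # separating hyperplane found in finitely many updates
            return w, b, True
    return w, b, False     # no mistake-free pass within the epoch budget

# Linearly separable: +1 points lie above the line x1 + x2 = 1, -1 below.
separable = [((2.0, 2.0), 1), ((1.5, 1.0), 1), ((0.0, 0.0), -1), ((-1.0, 0.5), -1)]
print(train_perceptron(separable)[2])  # True

# XOR is not linearly separable, so training never converges.
xor = [((0.0, 0.0), -1), ((0.0, 1.0), 1), ((1.0, 0.0), 1), ((1.0, 1.0), -1)]
print(train_perceptron(xor)[2])        # False
```

In practice the algorithm is run with a cap on epochs, as here; without the cap, the non-separable case would cycle forever, which is exactly the limitation the explanations describe.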
