MCQ IN COMPUTER SCIENCE & ENGINEERING

MACHINE LEARNING

Question
If we don't assign a base estimator to the bagging classifier, it will use by default:
A. Linear regression
B. Decision tree
C. KNN
D. Logistic regression
Explanation: scikit-learn's BaggingClassifier uses a decision tree (DecisionTreeClassifier) as its default base estimator, so the correct answer is (B).

Detailed explanation-1: -Because bagging and boosting each rely on collections of classifiers, they’re known as ‘ensemble’ methods.

Detailed explanation-2: -Bagging trains many copies of the same type of learner independently on bootstrap samples of the data and averages their predictions, while boosting trains learners sequentially, with each one focusing on the examples its predecessors got wrong. Bagging mainly decreases variance (which helps against over-fitting); boosting mainly decreases bias.
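
As a rough illustration of this contrast, the sketch below (assuming scikit-learn is installed; the synthetic dataset and parameters are illustrative choices, not part of the question) fits a bagging ensemble and a boosting ensemble on the same data:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

# Synthetic data; the size and seed are arbitrary choices for the demo.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Bagging: independent learners on bootstrap samples (variance reduction).
bagging = BaggingClassifier(n_estimators=50, random_state=0)
# Boosting: sequential learners re-weighting hard examples (bias reduction).
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

for name, model in [("bagging", bagging), ("boosting", boosting)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(name, round(scores.mean(), 3))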

Detailed explanation-3: -What is the biggest weakness of decision trees compared to logistic regression classifiers? Decision trees are more likely to overfit the data, since they can split on many different combinations of features, whereas logistic regression associates only one parameter with each feature.
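
A small sketch of this overfitting gap (the dataset, label noise, and split are hypothetical, chosen only for illustration): an unconstrained decision tree typically reaches near-perfect training accuracy but a noticeably lower test score, while logistic regression shows a smaller gap.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Noisy synthetic data (flip_y adds label noise so the tree can overfit).
X, y = make_classification(n_samples=400, n_features=20, flip_y=0.2,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, model in [("decision tree", DecisionTreeClassifier(random_state=0)),
                    ("logistic regression", LogisticRegression(max_iter=1000))]:
    model.fit(X_tr, y_tr)
    # Compare train vs. test accuracy; a large gap indicates overfitting.
    print(name, "train:", round(model.score(X_tr, y_tr), 2),
          "test:", round(model.score(X_te, y_te), 2))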

Detailed explanation-4: -Bagging meta-estimator. In ensemble algorithms, bagging methods form a class of algorithms which build several instances of a black-box estimator on random subsets of the original training set and then aggregate their individual predictions to form a final prediction.
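
The point of the question can be checked directly. A minimal sketch, assuming scikit-learn: fit a BaggingClassifier without passing any base estimator and inspect what it trained internally.

from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier

X, y = make_classification(n_samples=200, random_state=0)

# No base estimator supplied, so the default is used.
clf = BaggingClassifier(random_state=0).fit(X, y)

# Each fitted member of the ensemble is a decision tree.
print(type(clf.estimators_[0]).__name__)  # -> DecisionTreeClassifier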
