MCQ IN COMPUTER SCIENCE & ENGINEERING

MACHINE LEARNING

Question
Which of the following is FALSE about Random Forest and AdaBoost?
A. Random Forest aims to decrease variance, not bias
B. AdaBoost aims to decrease bias, not variance
C. Both AdaBoost and Random Forest aim to decrease both bias and variance
D. None of the above
Explanation: Random Forest is a bagging ensemble that mainly aims to decrease variance, while AdaBoost is a boosting ensemble that mainly aims to decrease bias; the false statement is therefore option C.

Detailed explanation-1: -On the related question of what is true about Random Forest and Gradient Boosting ensemble methods: both algorithms are designed for classification as well as regression tasks.

Detailed explanation-2: -The statement that the training process of an individual tree in a random forest is the same as training a standalone decision tree is not true: each tree in the forest is trained on a bootstrap sample of the data and considers only a random subset of the features at each split.
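
The point above can be seen directly in the default settings of a common implementation. The following is a minimal sketch assuming a recent version of scikit-learn; it only inspects constructor parameters, nothing is fitted.

```python
# Sketch (assumes a recent scikit-learn) of why training one tree inside a
# random forest is not the same as training a standalone decision tree:
# the forest injects randomness through bootstrap resampling of rows and a
# random subset of candidate features at every split.
from sklearn.ensemble import RandomForestClassifier

rf = RandomForestClassifier()
print(rf.bootstrap)     # True  -> each tree is fit on a bootstrap sample
print(rf.max_features)  # "sqrt" in recent versions -> random feature subset per split
```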

Detailed explanation-3: -In a random forest, each tree is grown to full size; some trees end up bigger than others, but there is no predetermined maximum depth. In contrast, in an ensemble built with AdaBoost, the trees are usually just a single node with two leaves (decision stumps).
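
To make the depth contrast concrete, here is a minimal sketch assuming scikit-learn is available; the synthetic dataset and the ensemble sizes are arbitrary choices, not part of the original question.

```python
# Contrast tree depth in the two ensembles: random forest trees are grown
# full-size (max_depth=None by default), while AdaBoost's default base
# learners are depth-1 stumps. (Dataset and parameters are arbitrary.)
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
ada = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)

rf_depths = [t.get_depth() for t in rf.estimators_]
ada_depths = [t.get_depth() for t in ada.estimators_]

print("random forest tree depths:", min(rf_depths), "-", max(rf_depths))
print("adaboost tree depths:     ", min(ada_depths), "-", max(ada_depths))
```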

Detailed explanation-4: -It is well known that random forests reduce the variance of the regression predictors compared to a single tree, while leaving the bias unchanged.
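
As a rough illustration of that variance argument, the sketch below (again assuming scikit-learn; the synthetic regression data and the number of repetitions are arbitrary) refits a single tree and a random forest on resampled training sets and compares how much their test-set predictions fluctuate. The forest's spread should come out well below the single tree's.

```python
# Refit a single tree and a random forest on bootstrapped training sets and
# compare the variance of their predictions at fixed test points.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=600, n_features=8, noise=15.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
tree_preds, forest_preds = [], []
for _ in range(20):
    # Bootstrap the training set to simulate drawing a fresh training sample.
    idx = rng.integers(0, len(X_tr), len(X_tr))
    tree_preds.append(
        DecisionTreeRegressor(random_state=0).fit(X_tr[idx], y_tr[idx]).predict(X_te))
    forest_preds.append(
        RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr[idx], y_tr[idx]).predict(X_te))

# Prediction variance across the 20 refits, averaged over the test points.
print("single tree variance:  ", np.var(tree_preds, axis=0).mean())
print("random forest variance:", np.var(forest_preds, axis=0).mean())
```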
