COMPUTER SCIENCE AND ENGINEERING
MACHINE LEARNING
Question: Which of the following is true about Random Forest and AdaBoost?
(A) Random Forest aims to decrease variance, not bias
(B) AdaBoost aims to decrease bias, not variance
(C) Both AdaBoost and Random Forest aim to decrease both bias and variance
(D) None of the above
Detailed explanation-1: Both algorithms are designed for classification as well as regression tasks.
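As a quick illustration of that point, here is a minimal sketch assuming scikit-learn (a library choice of ours, not named in the original), where each method ships in both a classifier and a regressor flavor:

from sklearn.ensemble import (
    AdaBoostClassifier,
    AdaBoostRegressor,
    RandomForestClassifier,
    RandomForestRegressor,
)

# Both ensembles expose a classification and a regression variant,
# matching the claim that they handle both task types.
classifiers = [RandomForestClassifier(), AdaBoostClassifier()]
regressors = [RandomForestRegressor(), AdaBoostRegressor()]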
Detailed explanation-2: The statement that training an individual tree in a random forest is the same as training a standalone decision tree is not true: each forest tree is fit on a bootstrap sample of the data and considers only a random subset of the features at each split.
Detailed explanation-3: In a random forest, each tree is grown to full size. Some trees may be bigger than others, but there is no predetermined maximum depth. In contrast, in a forest of trees made with AdaBoost, the trees are usually just a node and two leaves (decision stumps).
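To make the depth contrast concrete, here is a minimal sketch assuming a recent scikit-learn (1.2 or later, where AdaBoostClassifier takes an estimator parameter); the class names and defaults below are scikit-learn's, not something stated in the original:

from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

# Random forest: max_depth=None is the default, so every tree is grown
# until its leaves are pure; there is no predetermined maximum depth.
forest = RandomForestClassifier(n_estimators=100, max_depth=None)

# AdaBoost: the usual weak learner is a decision stump, i.e. a tree
# with a single split node and two leaves (max_depth=1).
stump = DecisionTreeClassifier(max_depth=1)
boost = AdaBoostClassifier(estimator=stump, n_estimators=100)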
Detailed explanation-4: It is well known that random forests reduce the variance of the regression predictors compared to a single tree, while leaving the bias unchanged.
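A rough way to check this variance claim empirically is to refit both models on many bootstrap resamples and measure how much their predictions move at fixed test points. The sketch below assumes scikit-learn and a synthetic Friedman #1 regression task; none of these specifics come from the original explanation:

import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X, y = make_friedman1(n_samples=500, noise=1.0, random_state=0)
X_test = X[:50]  # fixed points at which to measure prediction spread

tree_preds, forest_preds = [], []
for seed in range(20):
    # Draw a bootstrap resample to simulate a fresh training set.
    idx = rng.integers(0, len(X), size=len(X))
    Xb, yb = X[idx], y[idx]
    tree_preds.append(
        DecisionTreeRegressor(random_state=seed).fit(Xb, yb).predict(X_test))
    forest_preds.append(
        RandomForestRegressor(n_estimators=100, random_state=seed)
        .fit(Xb, yb).predict(X_test))

# Variance of the predictions across resamples, averaged over the test
# points; the forest's spread is typically much smaller than the tree's.
print("single tree variance:  ", np.var(tree_preds, axis=0).mean())
print("random forest variance:", np.var(forest_preds, axis=0).mean())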