COMPUTER SCIENCE AND ENGINEERING
MACHINE LEARNING
Question: Which of the following statements is true about Random Forest?
Random Forest aims to decrease both variance and bias
Random Forest aims to decrease variance, not bias
Random Forest aims to decrease bias, not variance
None of the above
Detailed explanation-1: It is well known that random forests reduce the variance of the regression predictions compared to a single tree, while leaving the bias essentially unchanged.
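A rough way to see this empirically is to retrain a single decision tree and a random forest on repeated bootstrap resamples of the same data and compare how much their predictions fluctuate. The sketch below uses scikit-learn; the synthetic dataset, the number of resampling rounds, and the hyperparameters are assumptions chosen for illustration, not part of the original explanation.

# Sketch: estimate the prediction variance of a single tree vs. a random forest
# by retraining each model on bootstrap resamples of the same training data.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.RandomState(0)
X, y = make_regression(n_samples=300, n_features=10, noise=10.0, random_state=0)
X_eval = X[:50]  # fixed evaluation points

def prediction_variance(make_model, n_rounds=30):
    # Average per-point variance of predictions across bootstrap retrainings.
    preds = []
    for _ in range(n_rounds):
        idx = rng.randint(0, len(X), len(X))  # bootstrap resample of the rows
        preds.append(make_model().fit(X[idx], y[idx]).predict(X_eval))
    return np.var(np.stack(preds), axis=0).mean()

tree_var = prediction_variance(lambda: DecisionTreeRegressor(random_state=0))
forest_var = prediction_variance(lambda: RandomForestRegressor(n_estimators=100, random_state=0))
print("single tree prediction variance:  ", round(float(tree_var), 1))
print("random forest prediction variance:", round(float(forest_var), 1))
# The forest's predictions typically move far less across resamples:
# averaging many de-correlated trees lowers variance.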
Detailed explanation-2: Random forests are an ensemble method: they combine and average the predictions from a large number of trees.
Detailed explanation-3: One way random forests reduce variance is by training each tree on a different bootstrap sample of the data. A second way is by using a random subset of features: if we have 30 features, only a certain number of them, say five, are considered at each split.
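Expressed as a quick sketch in scikit-learn (the parameter values below are illustrative assumptions), these two sources of randomness correspond to the bootstrap and max_features arguments:

# Sketch: the two randomness sources described above as scikit-learn settings.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=30, random_state=0)

forest = RandomForestClassifier(
    n_estimators=200,
    bootstrap=True,   # each tree is trained on a different bootstrap sample of the rows
    max_features=5,   # only 5 of the 30 features are considered at each split
    random_state=0,
).fit(X, y)

# The trees differ because of the row and feature subsampling, so averaging
# their votes reduces the variance of the ensemble's predictions.
print(len(forest.estimators_), "de-correlated trees in the fitted ensemble")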
Detailed explanation-4: Both bagging and random forests use bootstrap sampling, which, as described in "Elements of Statistical Learning", increases the bias of a single tree. Furthermore, because random forests also limit the variables allowed for splitting at each node, the bias of a single random forest tree is increased even more.