COMPUTER SCIENCE AND ENGINEERING
MACHINE LEARNING
Question
(A) Order of trees matters in Random Forest
(B) Order of trees matters in AdaBoost
(C) Order of trees does not matter in Random Forest and AdaBoost
(D) None of the above
Detailed explanation-1: -Which of the following is/are true about the Random Forest and Gradient Boosting ensemble methods? Both algorithms are designed for classification as well as regression tasks.
Detailed explanation-2: -Here is a main difference between Random Forest and AdaBoost: decision trees in Random Forest have equal contributions to the final prediction, whereas decision stumps in AdaBoost have different contributions, i.e. some stumps have more say than others.
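The contrast above can be sketched in a few lines of plain Python. The per-tree predictions and the AdaBoost stump weights below are hypothetical values chosen for illustration: Random Forest averages every tree's vote equally, while AdaBoost takes a weighted sum, so a high-weight stump can overrule the majority.

```python
# Hypothetical per-tree predictions in {-1, +1} and hypothetical
# AdaBoost stump weights (alphas) -- illustrative values only.
tree_preds = [+1, +1, -1, +1]
alphas = [0.2, 0.3, 1.5, 0.1]

# Random Forest: unweighted average; the sign of the mean is a majority vote.
rf_score = sum(tree_preds) / len(tree_preds)
rf_pred = 1 if rf_score > 0 else -1

# AdaBoost: weighted sum; stumps with larger alpha have more say.
ada_score = sum(a * p for a, p in zip(alphas, tree_preds))
ada_pred = 1 if ada_score > 0 else -1

print(rf_pred, ada_pred)  # the single high-alpha stump flips the AdaBoost vote
```

Here three of four trees vote +1, so Random Forest predicts +1, but the third stump's large weight (1.5) drags the AdaBoost score negative.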
Detailed explanation-3: -Which of the following statements is true about random forests? Random forests are an ensemble method: they combine and average the predictions from a large number of trees.
Detailed explanation-4: -Which of the following statements is/are correct regarding AdaBoost? AdaBoost builds weak learners (decision trees) with restricted depth. Since the decision trees are weak learners, they are mainly one- or two-level decision trees (restricted depth).
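A minimal sketch, on toy data, of why the correct answer is that order matters in AdaBoost: each round's sample weights depend on the previous round's mistakes, so fitting the same two stumps in a different order yields different stump weights (alphas). The labels and stump predictions below are made up for illustration.

```python
import math

y = [1, 1, 1, -1, -1]  # toy labels
# Hypothetical predictions of two depth-1 stumps on the five samples:
stump_preds = [
    [1, 1, 1, -1, 1],   # stump A: wrong on sample 4
    [-1, 1, 1, 1, -1],  # stump B: wrong on samples 0 and 3
]

def run(order):
    """Fit the stumps in the given order, returning each round's alpha."""
    w = [1 / len(y)] * len(y)  # start with uniform sample weights
    alphas = []
    for s in order:
        preds = stump_preds[s]
        # weighted error of this stump under the current sample weights
        err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
        alpha = 0.5 * math.log((1 - err) / err)
        alphas.append(round(alpha, 3))
        # upweight mistakes, downweight correct samples, renormalize
        w = [wi * math.exp(-alpha * yi * p) for wi, p, yi in zip(w, preds, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return alphas

print(run([0, 1]))  # fit A then B
print(run([1, 0]))  # fit B then A: different alphas, so order matters
```

Because the second stump is scored against weights reshaped by the first, the two orderings produce different alpha sequences; in a Random Forest, where every tree is trained independently and votes equally, permuting the trees changes nothing.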