COMPUTER SCIENCE AND ENGINEERING
MACHINE LEARNING
Question
Number of nearest neighbors in KNN
Number of outliers in KNN
Depth of a Decision Tree
Number of trees in a Random Forest
Detailed explanation-1: The learning rate is not a hyperparameter of a random forest. Increasing the number of trees does not make the model underfit; it mainly reduces the variance of the ensemble, at the cost of extra computation.
Detailed explanation-2: K-Nearest Neighbors (KNN): the most important hyperparameter is the number of neighbors (n_neighbors). Test values from at least 1 to 21, perhaps only the odd values. It can also be worth trying different distance metrics (metric), since the metric determines how the neighborhood is composed.
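As a concrete illustration, a minimal sketch of such a search, assuming scikit-learn and a small synthetic dataset (the parameter names n_neighbors and metric are scikit-learn's; the data and value ranges here are only examples):

# Minimal sketch: tuning n_neighbors and metric for KNN with scikit-learn.
# Uses a synthetic classification dataset; swap in your own data.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

param_grid = {
    "n_neighbors": list(range(1, 22, 2)),            # odd values from 1 to 21
    "metric": ["euclidean", "manhattan", "minkowski"],
}

search = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)   # best combination found by cross-validation
print(search.best_score_)    # mean cross-validated accuracy for that combination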
Detailed explanation-3: The kNN-based outlier-detection method uses the distance from an object to its k nearest neighbours: given k and n, a point p is an outlier if no more than n−1 other points in the data set have a higher value of D^k (the distance to their k-th nearest neighbour) than p.
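A minimal sketch of that rule, assuming scikit-learn's NearestNeighbors and a toy 2-D dataset (the names k, n, and dk below are illustrative, not part of any library API):

# Sketch of the D^k outlier rule described above, assuming scikit-learn.
# dk[i] is the distance from point i to its k-th nearest neighbour; the n points
# with the largest dk are flagged as outliers.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, size=(100, 2)),   # dense cluster of inliers
               rng.normal(8, 1, size=(3, 2))])    # a few far-away points

k, n = 5, 3
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)   # +1 because each point is its own nearest neighbour
distances, _ = nn.kneighbors(X)
dk = distances[:, k]                              # distance to the k-th nearest neighbour

outlier_idx = np.argsort(dk)[-n:]                 # the n points with the largest D^k
print(outlier_idx)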
Detailed explanation-4: Random Forest and Extra Trees do not have a learning rate as a hyperparameter; the learning rate belongs to boosting methods such as gradient boosting.
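This can be checked directly, in a small sketch assuming scikit-learn's estimators (get_params() lists each model's hyperparameters):

# Small check, assuming scikit-learn: random forest and extra trees expose no
# learning_rate, while a boosting model such as GradientBoostingClassifier does.
from sklearn.ensemble import (ExtraTreesClassifier, GradientBoostingClassifier,
                              RandomForestClassifier)

for model in (RandomForestClassifier(), ExtraTreesClassifier(), GradientBoostingClassifier()):
    has_lr = "learning_rate" in model.get_params()
    print(f"{type(model).__name__}: learning_rate hyperparameter = {has_lr}")

# Expected output: False for RandomForestClassifier and ExtraTreesClassifier,
# True for GradientBoostingClassifier.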