MCQ IN COMPUTER SCIENCE & ENGINEERING

MACHINE LEARNING

Question
Adding a new feature to the model always results in equal or better performance on the training set?
A. False
B. True
C. Either A or B
D. None of the above
Correct Answer: B (True)

Explanation:

Detailed explanation-1: Adding many new features gives us a more expressive model that can fit the training set better. If too many new features are added, this can lead to overfitting of the training set. Introducing regularization to such a model often improves performance on examples not in the training set, since it counteracts overfitting, although too much regularization can itself cause underfitting.
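As a quick illustration of that trade-off, the sketch below is a synthetic example using scikit-learn (the dataset, polynomial degrees, and alpha value are arbitrary choices made here for illustration, not part of the original question). It fits polynomial models of increasing degree and then regularizes the most expressive one with Ridge (L2):

```python
# Sketch: more features fit the training set better but can overfit,
# and regularization tends to help on held-out examples.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=60)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

for degree in (1, 3, 12):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_tr, y_tr)
    # Training error falls as the degree grows; test error eventually rises.
    # Exact numbers depend on the random draw.
    print(f"degree={degree:2d}  "
          f"train MSE={mean_squared_error(y_tr, model.predict(X_tr)):.3f}  "
          f"test MSE={mean_squared_error(y_te, model.predict(X_te)):.3f}")

# Regularizing the most expressive model usually recovers much of the lost test performance.
reg = make_pipeline(PolynomialFeatures(12), Ridge(alpha=1.0))
reg.fit(X_tr, y_tr)
print(f"degree=12 + Ridge  test MSE={mean_squared_error(y_te, reg.predict(X_te)):.3f}")
```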

Detailed explanation-2: Adding a new feature to the model always results in equal or better performance on the training set, because the learner can simply assign the new feature a zero weight if it does not help. Adding many new features, however, does not prevent overfitting; on the contrary, it makes overfitting of the training set more likely.
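The "equal or better on the training set" claim can be checked directly for ordinary least squares, where nested feature sets give non-increasing training error. The snippet below is a small, assumed setup with synthetic data, not part of the original quiz:

```python
# Sketch: with the training objective minimized exactly, adding a feature column
# can only lower or preserve the training error.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=100)

prev_mse = np.inf
for k in range(1, 6):
    model = LinearRegression().fit(X[:, :k], y)
    mse = mean_squared_error(y, model.predict(X[:, :k]))
    # Training error never increases as features are added (up to numerical tolerance).
    assert mse <= prev_mse + 1e-8
    print(f"features used: {k}  train MSE: {mse:.4f}")
    prev_mse = mse
```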

Detailed explanation-3: If your model is underfitting the training set, then obtaining more data is unlikely to help. A model that underfits has not captured the information in the examples you already have, so adding further examples will not help.
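A rough way to see this is a learning-curve-style check: fit a deliberately underfitting model (a straight line on quadratic data, a synthetic setup chosen here purely for illustration) on progressively larger training sets and watch the training error stay high:

```python
# Sketch: more data does not fix underfitting; the training error plateaus at a high value.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(2)
X_all = rng.uniform(-3, 3, size=(2000, 1))
y_all = X_all.ravel() ** 2 + rng.normal(scale=0.2, size=2000)  # quadratic target

# A straight line cannot capture the quadratic shape no matter how much data it sees.
for n in (50, 200, 1000, 2000):
    model = LinearRegression().fit(X_all[:n], y_all[:n])
    mse = mean_squared_error(y_all[:n], model.predict(X_all[:n]))
    print(f"n={n:4d}  train MSE={mse:.3f}")  # stays high: high bias, not a data problem
```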

Detailed explanation-4: If the training and test errors are about the same (and both are high), the model is facing a high-bias (underfitting) problem. Collecting more training data will not improve the results in this situation, but adding more features can help solve the high-bias problem.
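The sketch below (again a synthetic example assumed here, not taken from the original source) shows that diagnosis in code: with only the raw feature, training and test errors are both high and close together (high bias); adding a squared feature brings both down:

```python
# Sketch: train error close to test error and both high -> high bias; add features to fix it.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(400, 1))
y = X.ravel() ** 2 + rng.normal(scale=0.2, size=400)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

def report(name, Xtr, Xte):
    model = LinearRegression().fit(Xtr, y_tr)
    print(f"{name:10s} train MSE={mean_squared_error(y_tr, model.predict(Xtr)):.3f}  "
          f"test MSE={mean_squared_error(y_te, model.predict(Xte)):.3f}")

report("x only", X_tr, X_te)                       # both errors high and similar: high bias
report("x and x^2", np.hstack([X_tr, X_tr ** 2]),  # extra feature addresses the bias
       np.hstack([X_te, X_te ** 2]))
```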

Detailed explanation-5: Features should be brought to a comparable scale before applying L1 or L2 regularization; otherwise the penalty term treats the coefficients of differently scaled features unequally.
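In scikit-learn this is commonly done by placing a scaler in front of the regularized estimator in a Pipeline. The sketch below uses illustrative data and an arbitrary alpha (assumptions made here, not from the source) to show the pattern with StandardScaler and Lasso (L1):

```python
# Sketch: scale features before L1/L2 regularization so the penalty does not
# arbitrarily favour features that happen to be expressed in small units.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 3))
X[:, 2] *= 1000.0                      # one feature on a much larger scale
y = X[:, 0] + 0.002 * X[:, 2] + rng.normal(scale=0.1, size=200)

# Scaling inside the pipeline means the same transform is learned on the training
# data and re-applied consistently at prediction time.
model = make_pipeline(StandardScaler(), Lasso(alpha=0.05))
model.fit(X, y)
print("coefficients on standardized features:", model.named_steps["lasso"].coef_)
```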
