DATABASE FUNDAMENTALS
BASICS OF BIG DATA
Question
(A) True
(B) False
(C) Either A or B
(D) None of the above
Detailed explanation-1: When both the training error and the test error are large, we say the model has a bias (underfitting) problem. When the training error is low but the test error is high, we say the model has a variance (overfitting) problem. When both the training error and the test error are low enough to be acceptable, we call the model a good fit or best fit.
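A minimal sketch in Python of the diagnosis rule in explanation-1. The error values and the acceptability threshold are illustrative assumptions, not taken from the original question.

def diagnose_fit(train_error, test_error, acceptable=0.10):
    """Classify a model as high bias, high variance, or a good fit
    from its training and test errors, per the rule above."""
    if train_error > acceptable and test_error > acceptable:
        return "bias problem (underfitting)"        # both errors large
    if train_error <= acceptable and test_error > acceptable:
        return "variance problem (overfitting)"     # low train error, high test error
    return "good fit"                               # both errors acceptably low

# Hypothetical values: low training error but high test error.
print(diagnose_fit(train_error=0.02, test_error=0.35))  # variance problem (overfitting)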
Detailed explanation-2: Your model is overfitting the training data when it performs well on the training data but does not perform well on the evaluation data. This happens because the model memorizes the data it has seen and is unable to generalize to unseen examples.
Detailed explanation-3: Statement C is not true about overfitting. In fact, an overfitted model may perform poorly on new records even when those records are similar to the data used for training and testing.
Detailed explanation-4: If a model has been trained too well on the training data, it will be unable to generalize: it will make inaccurate predictions when given new data, making it useless even though it makes accurate predictions on the training data. This is called overfitting.
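As an illustration of explanations 2 and 4, the sketch below shows the typical signature of overfitting: near-perfect accuracy on the training data and noticeably lower accuracy on held-out data. It assumes scikit-learn is installed; the synthetic dataset and the choice of an unconstrained decision tree are illustrative assumptions, not part of the original question.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic classification data, split into training and held-out sets.
X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An unconstrained decision tree can memorize the training set.
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

train_acc = model.score(X_train, y_train)   # typically close to 1.0 (memorized)
test_acc = model.score(X_test, y_test)      # noticeably lower on unseen data
print(f"train accuracy: {train_acc:.2f}, test accuracy: {test_acc:.2f}")
# A large gap between the two numbers is the overfitting described above.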