DATABASE FUNDAMENTALS
BASICS OF BIG DATA
Question
1 and 2
2 and 3
1 and 3
1, 2 and 3
Detailed explanation-1: -An increase in K results in more time required to cross-validate. Higher values of K give higher confidence in the cross-validation result than lower values of K. If K = N, where N is the number of observations, the procedure is called leave-one-out cross-validation.
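The following sketch illustrates that point; the synthetic dataset, the logistic-regression model, and the use of scikit-learn are assumptions added for illustration and are not part of the original question. It shows that k-fold cross-validation with K = N yields the same result as leave-one-out cross-validation:

# Illustrative sketch only: dataset and estimator are assumed, not given in the question.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, LeaveOneOut, cross_val_score

X, y = make_classification(n_samples=30, n_features=5, random_state=0)
model = LogisticRegression(max_iter=1000)

# Setting K equal to the number of observations reduces k-fold CV to leave-one-out CV.
kfold_scores = cross_val_score(model, X, y, cv=KFold(n_splits=len(X)))
loo_scores = cross_val_score(model, X, y, cv=LeaveOneOut())
print(kfold_scores.mean(), loo_scores.mean())  # identical by construction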
Detailed explanation-2: -Cross-validation is also used to pick the type of prediction function to be used.
Detailed explanation-3: -k-fold cross-validation is about estimating the accuracy of a model, not improving it.
Detailed explanation-4: -K-fold cross-validation is one method that attempts to maximize the use of the available data for training and then testing a model. It is particularly useful for assessing model performance, as it provides a range of accuracy scores across (somewhat) different subsets of the data.
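As a minimal sketch of that idea, again assuming scikit-learn, a synthetic dataset, a decision-tree classifier, and K=5, none of which appear in the original material, 5-fold cross-validation returns one accuracy score per fold, giving a sense of how performance varies across splits:

# Illustrative sketch only: dataset, model, and K=5 are assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=42)

# Each observation is used for testing exactly once and for training K-1 times.
scores = cross_val_score(DecisionTreeClassifier(random_state=42), X, y, cv=5)
print("Per-fold accuracy:", scores)
print("Mean accuracy: %.3f (+/- %.3f)" % (scores.mean(), scores.std()))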