APPLICATION OF SUPERVISED LEARNING
SUPERVISED AND UNSUPERVISED LEARNING
Question
- AUC
- Precision
- Predicted vs True Chart
- F1 Score
Detailed explanation-1: -F1 Score gives a combined view of the Precision and Recall metrics: it is the harmonic mean of the two, and it is maximized when Precision equals Recall.
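The relationship described above can be sketched in a few lines of plain Python. The counts below (true positives, false positives, false negatives) are made-up illustration values, not taken from the quiz.

```python
# Sketch: Precision, Recall, and F1 computed from raw counts.
# tp/fp/fn values are illustrative assumptions, not from the source.

def precision(tp, fp):
    # Of everything predicted positive, how much was actually positive?
    return tp / (tp + fp)

def recall(tp, fn):
    # Of everything actually positive, how much did we find?
    return tp / (tp + fn)

def f1_score(p, r):
    # F1 is the harmonic mean of precision and recall.
    return 2 * p * r / (p + r)

tp, fp, fn = 8, 2, 4
p = precision(tp, fp)   # 0.8
r = recall(tp, fn)      # 8/12 ~ 0.667
print(round(f1_score(p, r), 3))
```

Note that when precision equals recall, the harmonic mean equals their common value, which is why F1 peaks when the two metrics balance.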
Detailed explanation-2: -The key classification metrics are Accuracy, Recall, Precision, and F1-Score. Recall and Precision penalize different kinds of errors, and the decision threshold trades one off against the other, as summarized by the Receiver Operating Characteristic (ROC) curve.
Detailed explanation-3: -The confusion matrix is not a metric by itself but the basis for several metrics used to evaluate classification algorithms. A confusion matrix simply tabulates your algorithm's predictions against the true labels of the targets.
Detailed explanation-4: -The F1 score is most commonly used to measure the performance of binary classification, but extensions to multi-class classification exist (for example, macro- and micro-averaged F1).
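Tying explanations 3 and 4 together: per-class F1 can be derived directly from a confusion matrix, and macro-averaging those per-class scores is one common multi-class extension. A minimal sketch, assuming a 3x3 confusion matrix with made-up counts (rows = true class, columns = predicted class):

```python
# Sketch: per-class F1 from a confusion matrix, then macro-averaged
# for the multi-class case. The matrix values are illustrative only.

def per_class_f1(cm, k):
    n = len(cm)
    tp = cm[k][k]
    fp = sum(cm[i][k] for i in range(n) if i != k)  # predicted k, truly other
    fn = sum(cm[k][j] for j in range(n) if j != k)  # truly k, predicted other
    if tp == 0:
        return 0.0
    p = tp / (tp + fp)
    r = tp / (tp + fn)
    return 2 * p * r / (p + r)

cm = [[5, 1, 0],   # rows: true class, columns: predicted class
      [1, 4, 1],
      [0, 2, 6]]

macro_f1 = sum(per_class_f1(cm, k) for k in range(len(cm))) / len(cm)
print(round(macro_f1, 3))
```

Macro-averaging weights every class equally regardless of its frequency; micro-averaging instead pools all the TP/FP/FN counts before computing a single F1, which favors the majority classes.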