MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

HARD QUESTIONS

Question
In Decision Tree algorithms, the entropy of a given dataset is zero. This statement implies ____
A. Further splitting is required
B. Need some other information to decide splitting
C. No further splitting is required
D. None of the mentioned
Answer: C (No further splitting is required)

Explanation:

Detailed explanation-1: Entropy is typically measured between 0 and 1. (Depending on the number of classes in your dataset, entropy can be greater than 1, but it means the same thing: a very high level of disorder.)

Detailed explanation-2: Consider the case where all observations belong to the same class; then the entropy is always 0: E = −(1 × log₂(1)) = 0. When entropy is 0, the dataset has no impurity: every observation belongs to the same class, so there is nothing left to split on and the node requires no further splitting.

Detailed explanation-3: In the context of Decision Trees, entropy is a measure of disorder or impurity in a node. Thus, a node with a more variable composition, such as 2 Pass and 2 Fail, has higher entropy than a node containing only Pass or only Fail.
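The two cases above (a pure node with entropy 0 and a maximally mixed 2 Pass / 2 Fail node) can be checked with a minimal sketch of the Shannon entropy formula, E = −Σ pᵢ log₂(pᵢ). The function name `entropy` and the "pass"/"fail" labels are illustrative, not from any particular library:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    # Sum of -p * log2(p) over each class proportion p.
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

# Pure node: every observation in the same class -> entropy 0,
# so no further splitting is required (answer C).
print(entropy(["pass", "pass", "pass", "pass"]))  # 0.0

# Maximally mixed node: 2 Pass and 2 Fail -> entropy 1.0.
print(entropy(["pass", "pass", "fail", "fail"]))  # 1.0
```

With more than two classes the value can exceed 1 bit, which is why entropy is only "between 0 and 1" for binary classification.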
