COMPUTER SCIENCE AND ENGINEERING
MACHINE LEARNING
Question: In a decision tree, if the entropy at a node is zero, then:
(A) further splitting is required
(B) no further splitting is required
(C) some other information is needed to decide on splitting
(D) none of the mentioned

Correct answer: (B) no further splitting is required
Detailed explanation-1: When implementing a decision tree, if entropy(parent) is zero there is no reason to compute the information gain of the children, since the data are already perfectly classified (i.e., you are at a leaf node of the tree).
Detailed explanation-2: Consider the case in which all observations belong to the same class; the entropy is then always 0. When the entropy is 0 the dataset has no impurity, and a dataset with zero impurity provides no further information for learning, so the node need not be split again.
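A minimal sketch of this point (the function name and toy labels are illustrative, not from the original): a set in which every observation belongs to the same class has entropy 0, so there is nothing left to split on.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy in bits: H(S) = sum over classes of -p_i * log2(p_i)."""
    n = len(labels)
    return sum(-(count / n) * math.log2(count / n)
               for count in Counter(labels).values())

print(entropy(["yes", "yes", "yes", "yes"]))  # 0.0 -- pure node, no further splitting
print(entropy(["yes", "yes", "no", "no"]))    # 1.0 -- maximally impure for two classes
```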
Detailed explanation-3: Information gain is the number of bits saved, on average, when transmitting the dataset's labels once the split is known. The conditional entropy can be calculated by splitting the dataset into groups, one for each observed value of the attribute a, and summing the fraction of examples that fall in each group multiplied by the entropy of that group.
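Continuing the sketch above (and reusing its entropy function), this is one way to compute the weighted conditional entropy and subtract it from the parent entropy; the attribute values and data are hypothetical:

```python
def information_gain(labels, attribute_values):
    """IG(S, a) = H(S) - sum over values v of (|S_v| / |S|) * H(S_v)."""
    n = len(labels)
    groups = {}
    # Group the class labels by the observed value of the attribute.
    for label, value in zip(labels, attribute_values):
        groups.setdefault(value, []).append(label)
    # Weighted sum of each group's entropy, weighted by its share of the data.
    conditional_entropy = sum(len(group) / n * entropy(group)
                              for group in groups.values())
    return entropy(labels) - conditional_entropy

# Hypothetical toy data: the split separates the classes perfectly,
# so the gain equals the full parent entropy (1.0 bit).
labels  = ["yes", "yes", "no", "no"]
outlook = ["sunny", "sunny", "rain", "rain"]
print(information_gain(labels, outlook))  # 1.0
```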