MCQ IN COMPUTER SCIENCE & ENGINEERING

MACHINE LEARNING

Question
Which of the following is FALSE about entropy in the context of a decision tree?
A
Entropy keeps on increasing as we keep splitting the nodes
B
Entropy is calculated using Information Gain
C
None of the above
D
None of the above
Explanation: 

Detailed explanation-1: In a decision tree, the entropy of a node decreases as we go down the tree, so entropy does not keep increasing as the nodes are split.

Detailed explanation-2: Entropy measures the uncertainty (randomness) in the data; the greater the randomness, the higher the entropy. Information gain is computed from entropy, not the other way around: it is the reduction in entropy achieved by a split, and it is used in decision trees and random forests to decide the best split.
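
As a quick illustration of these definitions, the Python sketch below computes entropy and the information gain of a candidate split. The function names and the toy label arrays are made up for this example, not taken from the question.

    import numpy as np

    def entropy(labels):
        # Shannon entropy: H(S) = -sum(p_i * log2(p_i)) over the class proportions.
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def information_gain(parent_labels, child_label_groups):
        # Information gain = entropy(parent) - weighted average entropy of the children.
        n = len(parent_labels)
        weighted_children = sum((len(c) / n) * entropy(c) for c in child_label_groups)
        return entropy(parent_labels) - weighted_children

    # Toy example: a 50/50 parent node split into two purer children.
    parent = np.array([0, 0, 0, 0, 1, 1, 1, 1])
    left = np.array([0, 0, 0, 1])
    right = np.array([1, 1, 1, 0])

    print(entropy(parent))                          # 1.0 bit: maximum uncertainty
    print(information_gain(parent, [left, right]))  # about 0.19: entropy drops after the split

The positive information gain shows that splitting reduces entropy, which is why the claim that entropy keeps increasing as nodes are split is false.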

Detailed explanation-3: The entropy of a node typically decreases as we go down a decision tree; a statement that entropy keeps increasing with each split is therefore false.
