MACHINE LEARNING

APPLICATION OF SUPERVISED LEARNING

CLASSIFICATION IN MACHINE LEARNING

Question
What are PCA components?
A. Set of all eigenvectors for the projection space
B. Matrix of principal components
C. Result of the matrix multiplication
D. None of the above options
Explanation: 

Detailed explanation-1: Eigenvalues represent the amount of variance explained by each principal component. Because they are eigenvalues of the data's covariance matrix, which is positive semi-definite, they are never negative; a component with a larger eigenvalue captures more of the total variance, while components whose eigenvalues are close to zero explain little and can usually be discarded.
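
A minimal sketch of this relationship, assuming NumPy and scikit-learn are available (the Iris data set is used purely for illustration): the eigenvalues of the sample covariance matrix match the per-component explained variance reported by PCA, and none of them is negative.

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA

    X = load_iris().data                       # (n_samples, n_features) data matrix

    # Eigenvalues of the sample covariance matrix, largest first.
    cov = np.cov(X, rowvar=False)              # covariance over the feature columns
    eigvals = np.linalg.eigvalsh(cov)[::-1]    # all values are >= 0

    # PCA reports the same quantities as per-component explained variance.
    pca = PCA().fit(X)
    print(np.allclose(eigvals, pca.explained_variance_))   # True (up to rounding)
    print((eigvals >= 0).all())                            # True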

Detailed explanation-2: Projecting the data is the last step of PCA. Let Q be the (d x k) matrix whose columns are the top-k eigenvectors. Multiplying Q by its transpose gives a (d x d) matrix, and multiplying that by the (d x n) data matrix X yields the (d x n) projection of the data onto the principal subspace. The reduced (k x n) representation is obtained as Q transpose times X.
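
A rough NumPy sketch of that projection step, following the explanation's layout (samples are the columns of a d x n matrix X, and the columns of Q hold the top-k eigenvectors); the random data and variable names are only illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    d, n, k = 5, 100, 2

    X = rng.normal(size=(d, n))                # data matrix, one column per sample
    X = X - X.mean(axis=1, keepdims=True)      # center each feature (row)

    # Eigenvectors of the (d x d) covariance matrix, sorted by decreasing eigenvalue.
    cov = X @ X.T / (n - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigh returns ascending order
    Q = eigvecs[:, ::-1][:, :k]                # (d x k): top-k principal directions

    Z = Q.T @ X                                # (k x n) reduced representation
    X_proj = Q @ Q.T @ X                       # (d x d) times (d x n) -> (d x n) projection
    print(Z.shape, X_proj.shape)               # (2, 100) (5, 100)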
