COMPUTER SCIENCE AND ENGINEERING
MACHINE LEARNING
Question
You will add more features
You will start introducing higher degree features
You will remove some features
None of the above.
Detailed explanation-1: -In the case of underfitting, you need to introduce more variables into the feature space, or add some polynomial-degree features, so that the model becomes complex enough to fit the data better.
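To make the idea concrete, here is a minimal Python sketch on hypothetical data (not from the question): an underfitting linear model is given higher-degree features via scikit-learn's PolynomialFeatures, and the fit improves.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical data: y depends on x quadratically, so a plain linear fit underfits.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = 1.5 * X[:, 0] ** 2 - 2.0 * X[:, 0] + rng.normal(scale=0.5, size=200)

# Plain linear model (underfits) versus one with degree-2 features added.
plain = LinearRegression().fit(X, y)
poly = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                     LinearRegression()).fit(X, y)

print("R^2, linear features only:", plain.score(X, y))
print("R^2, with degree-2 features:", poly.score(X, y))
```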
Detailed explanation-2: -There are four assumptions associated with a linear regression model: Linearity: The relationship between X and the mean of Y is linear. Homoscedasticity: The variance of the residuals is the same for any value of X. Independence: Observations are independent of each other. Normality: For any fixed value of X, the residuals are normally distributed.
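A quick way to spot-check these assumptions is to inspect the residuals of a fitted model. The sketch below uses hypothetical data, np.polyfit for the fit, and SciPy's Shapiro-Wilk test for the normality check.

```python
import numpy as np
from scipy import stats

# Hypothetical simple linear regression: y = 2 + 0.8*x + noise.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 300)
y = 2.0 + 0.8 * x + rng.normal(scale=1.0, size=300)

b1, b0 = np.polyfit(x, y, deg=1)          # slope, intercept
residuals = y - (b0 + b1 * x)

# Normality of residuals: a large Shapiro-Wilk p-value gives no evidence against normality.
print("Shapiro-Wilk p-value:", stats.shapiro(residuals).pvalue)

# Homoscedasticity spot-check: residual spread should be similar across the range of x.
low = residuals[x < np.median(x)]
high = residuals[x >= np.median(x)]
print("Residual std for low x vs high x:", low.std(), high.std())
```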
Detailed explanation-3: -Three statistics are used in Ordinary Least Squares (OLS) regression to evaluate model fit: R-squared, the overall F test, and the Root Mean Square Error (RMSE). All three are based on two sums of squares: Sum of Squares Total (SST) and Sum of Squares Error (SSE).
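As a minimal sketch with hypothetical observed and predicted values, R-squared, RMSE, and the overall F statistic can all be computed from SST and SSE.

```python
import numpy as np

# Hypothetical observed values and OLS predictions from a model with p = 2 predictors.
y = np.array([3.1, 4.0, 5.2, 6.1, 6.9, 8.2])
y_hat = np.array([3.0, 4.1, 5.0, 6.2, 7.1, 8.0])
n, p = len(y), 2

sst = np.sum((y - y.mean()) ** 2)   # Sum of Squares Total
sse = np.sum((y - y_hat) ** 2)      # Sum of Squares Error

r_squared = 1.0 - sse / sst
rmse = np.sqrt(sse / n)                              # some texts divide by n - p - 1 instead
f_stat = ((sst - sse) / p) / (sse / (n - p - 1))     # overall F test

print(r_squared, rmse, f_stat)
```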
Detailed explanation-4: -Use the least-squares method to determine the equation of the line of best fit for the data.
Detailed explanation-5: -The normal equation gives a closed-form solution to the least-squares problem and is one of the most common approaches to solving linear regression models.
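A minimal sketch of the normal-equation approach on hypothetical data, solving (X^T X) theta = X^T y for the least-squares coefficients:

```python
import numpy as np

# Hypothetical data generated from y = 4 + 3*x plus noise.
rng = np.random.default_rng(2)
x = rng.uniform(0, 5, 50)
y = 4.0 + 3.0 * x + rng.normal(scale=0.5, size=50)

X = np.column_stack([np.ones_like(x), x])   # design matrix with an intercept column
# Normal equation: solve (X^T X) theta = X^T y for the least-squares coefficients.
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)   # close to [4.0, 3.0]
```

In practice np.linalg.lstsq is preferred when the design matrix is ill-conditioned, but the explicit solve above mirrors the normal-equation formula directly.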