COMPUTER SCIENCE AND ENGINEERING
MACHINE LEARNING
Question
Bias
RMSE
Variance
None of the above
Detailed explanation-1: -Here Y is the actual output and Y′ is the predicted output. Because the cost is built from the squared differences between them, this cost function is also known as the squared error function. It is the most commonly used cost function in linear regression because it is simple and works well.
Detailed explanation-2: -The cost function of a linear regression is mean squared error (MSE) or its square root, root mean squared error (RMSE); they rank models identically. We square the errors so that positive and negative errors do not cancel out.
Detailed explanation-3: -The most common metric for evaluating linear regression model performance is called root mean squared error, or RMSE. The basic idea is to measure how bad/erroneous the model’s predictions are when compared to actual observed values. So a high RMSE is “bad” and a low RMSE is “good”.
Detailed explanation-4: -In model fitting, a cost function may want to combine both the Mean Absolute Error (MAE) and Root Mean Square Error (RMSE). Cost functions with multiple terms need a way to balance the two terms, according to how much you value one over the other.
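The weighted combination described in explanation 4 can be sketched as below; the function names and the `alpha` weighting parameter are illustrative assumptions, not part of the original text.

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean Absolute Error: average of absolute differences
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

def rmse(y_true, y_pred):
    # Root Mean Square Error: square root of the mean squared difference
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

def combined_cost(y_true, y_pred, alpha=0.5):
    # alpha balances the two terms: alpha=1 is pure RMSE, alpha=0 is pure MAE
    return alpha * rmse(y_true, y_pred) + (1 - alpha) * mae(y_true, y_pred)
```

With `alpha` you express how much you value large-error sensitivity (RMSE) over robustness (MAE).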
Detailed explanation-5: -For linear regression, this MSE is nothing but the cost function. Mean Squared Error is the average of the squared differences between the predictions and the true values, and the output is a single number representing the cost.
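A minimal sketch of MSE as the cost and RMSE as its square root, assuming NumPy arrays of actual and predicted values (the function names are illustrative):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean Squared Error: average of squared differences between
    # predictions and true values; a single number representing the cost
    return float(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

def rmse(y_true, y_pred):
    # RMSE is the square root of MSE, so it is in the same units as y;
    # low RMSE is "good", high RMSE is "bad"
    return float(np.sqrt(mse(y_true, y_pred)))
```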