2.2.2 Properties of the Least-Squares Estimators and the Fitted Regression Model
The least-squares estimators β̂0 and β̂1 have several important properties. First, note that β̂0 and β̂1 are linear combinations of the observations yi. For example,

β̂1 = Σ ci yi

where

ci = (xi − x̄)/Sxx,  i = 1, 2, …, n

The least-squares estimators β̂0 and β̂1 are unbiased estimators of the model parameters β0 and β1. To show this for β̂1, consider

E(β̂1) = E(Σ ci yi) = Σ ci E(yi) = Σ ci (β0 + β1 xi) = β0 Σ ci + β1 Σ ci xi

since E(εi) = 0 by assumption. Now we can show directly that Σ ci = 0 and Σ ci xi = 1, so that

E(β̂1) = β1

That is, if we assume that the model is correct [E(yi) = β0 + β1xi], then β̂1 is an unbiased estimator of β1. Similarly we may show that β̂0 is an unbiased estimator of β0, that is, E(β̂0) = β0.
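The unbiasedness argument above lends itself to a numerical check. The sketch below (the regressor values, parameter values, and sample sizes are made up for illustration) verifies that Σ ci = 0 and Σ ci xi = 1, and that the average of β̂1 = Σ ci yi over many simulated samples is close to the true slope β1.

```python
import random

# Monte Carlo check of unbiasedness: E(beta1_hat) = beta1 under the model
# y_i = beta0 + beta1*x_i + eps_i with E(eps_i) = 0. All values are
# illustrative, not taken from the text.
random.seed(0)
beta0, beta1, sigma = 2.0, 0.5, 1.0
x = [float(i) for i in range(1, 11)]
xbar = sum(x) / len(x)
Sxx = sum((xi - xbar) ** 2 for xi in x)
c = [(xi - xbar) / Sxx for xi in x]   # beta1_hat = sum(c_i * y_i)

# The two identities used in the derivation:
print(abs(sum(c)) < 1e-9)                                   # sum c_i = 0
print(abs(sum(ci * xi for ci, xi in zip(c, x)) - 1) < 1e-9) # sum c_i x_i = 1

def slope_estimate():
    y = [beta0 + beta1 * xi + random.gauss(0.0, sigma) for xi in x]
    return sum(ci * yi for ci, yi in zip(c, y))

n_sim = 20000
avg = sum(slope_estimate() for _ in range(n_sim)) / n_sim
print(avg)   # close to beta1 = 0.5
```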
The variance of β̂1 is found as

Var(β̂1) = Var(Σ ci yi) = Σ ci² Var(yi)   (2.13)

because the observations yi are uncorrelated, and so the variance of the sum is just the sum of the variances. The variance of each term in the sum is ci² Var(yi), and we have assumed that Var(yi) = σ²; consequently,

Var(β̂1) = σ² Σ ci² = σ² Σ (xi − x̄)² / Sxx² = σ²/Sxx   (2.14)
The variance of β̂0 is

Var(β̂0) = Var(ȳ − β̂1 x̄) = Var(ȳ) + x̄² Var(β̂1) − 2x̄ Cov(ȳ, β̂1)

Now the variance of ȳ is just Var(ȳ) = σ²/n, and the covariance between ȳ and β̂1 can be shown to be zero. Thus,

Var(β̂0) = Var(ȳ) + x̄² Var(β̂1) = σ² (1/n + x̄²/Sxx)   (2.15)
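The variance formulas (2.13)–(2.15) can also be checked by simulation. In the sketch below (all numerical values are illustrative), the least-squares fit is repeated over many simulated samples and the empirical variances of β̂0 and β̂1 are compared with σ²(1/n + x̄²/Sxx) and σ²/Sxx.

```python
import random

# Monte Carlo check of Var(beta1_hat) = sigma^2/Sxx and
# Var(beta0_hat) = sigma^2*(1/n + xbar^2/Sxx). Values are illustrative.
random.seed(1)
beta0, beta1, sigma = 2.0, 0.5, 1.0
x = [float(i) for i in range(1, 11)]
n = len(x)
xbar = sum(x) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)

def fit():
    # Simulate one sample and return the least-squares estimates.
    y = [beta0 + beta1 * xi + random.gauss(0.0, sigma) for xi in x]
    ybar = sum(y) / n
    b1 = sum((xi - xbar) * yi for xi, yi in zip(x, y)) / Sxx
    b0 = ybar - b1 * xbar
    return b0, b1

sims = [fit() for _ in range(20000)]
m0 = sum(b0 for b0, _ in sims) / len(sims)
m1 = sum(b1 for _, b1 in sims) / len(sims)
var_b0 = sum((b0 - m0) ** 2 for b0, _ in sims) / len(sims)
var_b1 = sum((b1 - m1) ** 2 for _, b1 in sims) / len(sims)

print(var_b1, sigma ** 2 / Sxx)                          # should be close
print(var_b0, sigma ** 2 * (1 / n + xbar ** 2 / Sxx))    # should be close
```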
Another important result concerning the quality of the least-squares estimators β̂0 and β̂1 is the Gauss–Markov theorem: under the assumptions E(εi) = 0, Var(εi) = σ², and uncorrelated errors, the least-squares estimators are unbiased and have minimum variance among all unbiased estimators that are linear combinations of the yi. We often say that the least-squares estimators are best linear unbiased estimators, where "best" implies minimum variance.
There are several other useful properties of the least-squares fit:
1 The sum of the residuals in any regression model that contains an intercept β0 is always zero, that is,

Σ ei = 0

This property follows directly from the first normal equation in Eqs. (2.5) and is demonstrated in Table 2.2 for the residuals from Example 2.1. Rounding errors may affect the sum.
2 The sum of the observed values yi equals the sum of the fitted values ŷi, or

Σ yi = Σ ŷi

Table 2.2 demonstrates this result for Example 2.1.
3 The least-squares regression line always passes through the centroid [the point (x̄, ȳ)] of the data.
4 The sum of the residuals weighted by the corresponding value of the regressor variable always equals zero, that is,

Σ xi ei = 0
5 The sum of the residuals weighted by the corresponding fitted value always equals zero, that is,

Σ ŷi ei = 0
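All five properties can be confirmed numerically for any least-squares fit. The sketch below (the small data set is made up for illustration, not Example 2.1) fits a line and checks each property in turn.

```python
# Verify properties 1-5 of the least-squares fit on illustrative data.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 2.9, 3.6, 4.4, 5.2]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
Sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = Sxy / Sxx              # slope estimate
b0 = ybar - b1 * xbar       # intercept estimate

yhat = [b0 + b1 * xi for xi in x]          # fitted values
e = [yi - yh for yi, yh in zip(y, yhat)]   # residuals

print(abs(sum(e)) < 1e-9)                                   # 1: sum e_i = 0
print(abs(sum(y) - sum(yhat)) < 1e-9)                       # 2: sum y_i = sum yhat_i
print(abs((b0 + b1 * xbar) - ybar) < 1e-9)                  # 3: line through (xbar, ybar)
print(abs(sum(xi * ei for xi, ei in zip(x, e))) < 1e-9)     # 4: sum x_i e_i = 0
print(abs(sum(yh * ei for yh, ei in zip(yhat, e))) < 1e-9)  # 5: sum yhat_i e_i = 0
```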
2.2.3 Estimation of σ2
In addition to estimating β0 and β1, an estimate of σ2 is required to test hypotheses and construct interval estimates pertinent to the regression model. Ideally we would like this estimate not to depend on the adequacy of the fitted model. This is only possible when there are several observations on y for at least one value of x (see Section 4.5) or when prior information concerning σ2 is available. When this approach cannot be used, the estimate of σ2 is obtained from the residual or error sum of squares,

SSRes = Σ ei² = Σ (yi − ŷi)²   (2.16)
A convenient computing formula for SSRes may be found by substituting the fitted values ŷi = β̂0 + β̂1 xi and simplifying.
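The direct definition of SSRes and the standard computing identity SSRes = SST − β̂1 Sxy, where SST = Σ (yi − ȳ)², give the same value, as the sketch below confirms (the data are made up for illustration). The residual mean square SSRes/(n − 2), which divides by n − 2 because two parameters were estimated, is the usual unbiased estimator of σ².

```python
# Compute SS_Res directly from the residuals and via the identity
# SS_Res = SS_T - beta1_hat * Sxy. Data are illustrative, not from the text.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [1.8, 3.1, 3.9, 5.2, 5.8, 7.1]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
Sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = Sxy / Sxx
b0 = ybar - b1 * xbar

ss_res_direct = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
ss_t = sum((yi - ybar) ** 2 for yi in y)
ss_res_identity = ss_t - b1 * Sxy        # convenient computing formula

sigma2_hat = ss_res_direct / (n - 2)     # residual mean square, MS_Res
print(ss_res_direct, ss_res_identity)    # the two values agree
```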