$$SS_{Res} = \sum_{i=1}^{n} y_i^2 - n\bar{y}^2 - \hat{\beta}_1 S_{xy} \qquad (2.17)$$
But
$$\sum_{i=1}^{n} y_i^2 - n\bar{y}^2 = \sum_{i=1}^{n} (y_i - \bar{y})^2 \equiv SS_T$$
is just the corrected sum of squares of the response observations, so
$$SS_{Res} = SS_T - \hat{\beta}_1 S_{xy} \qquad (2.18)$$
The residual sum of squares has $n - 2$ degrees of freedom, because two degrees of freedom are associated with the estimates $\hat{\beta}_0$ and $\hat{\beta}_1$ involved in obtaining the fitted values $\hat{y}_i$. Because the expected value of $SS_{Res}$ is $E(SS_{Res}) = (n - 2)\sigma^2$, an unbiased estimator of $\sigma^2$ is
$$\hat{\sigma}^2 = \frac{SS_{Res}}{n - 2} = MS_{Res} \qquad (2.19)$$
The quantity $MS_{Res}$ is called the residual mean square. The square root of $\hat{\sigma}^2$ is sometimes called the standard error of regression, and it has the same units as the response variable $y$.
Because $\hat{\sigma}^2$ depends on the residual sum of squares, any violation of the assumptions on the model errors or any misspecification of the model form may seriously damage the usefulness of $\hat{\sigma}^2$ as an estimate of $\sigma^2$. Because $\hat{\sigma}^2$ is computed from the residuals of the fitted model, we say that it is a model-dependent estimate of $\sigma^2$.
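As a concrete illustration of Eqs. (2.17)-(2.19), the following sketch computes $SS_{Res}$ and $MS_{Res}$ for a simple linear regression fit. It is not code from the text; the function and variable names (residual_mean_square, S_xy, b1, and so on) are mine. Using the computing formula $SS_T - \hat{\beta}_1 S_{xy}$ avoids forming the fitted values explicitly, mirroring the hand calculation in Example 2.2 below.

```python
# Sketch: estimate sigma^2 = MS_Res for a simple linear regression,
# using the computing formula SS_Res = SS_T - b1 * S_xy.
import numpy as np

def residual_mean_square(x, y):
    """Return (SS_Res, MS_Res) for the least-squares line y = b0 + b1*x."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = x.size

    S_xy = np.sum(y * (x - x.mean()))     # corrected cross-product
    S_xx = np.sum((x - x.mean()) ** 2)    # corrected sum of squares of x
    b1 = S_xy / S_xx                      # least-squares slope

    SS_T = np.sum((y - y.mean()) ** 2)    # corrected SS of the response
    SS_Res = SS_T - b1 * S_xy             # Eq. (2.18)
    MS_Res = SS_Res / (n - 2)             # Eq. (2.19), unbiased for sigma^2
    return SS_Res, MS_Res
```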
Example 2.2 The Rocket Propellant Data
To estimate $\sigma^2$ for the rocket propellant data in Example 2.1, first find
$$SS_T = \sum_{i=1}^{n} y_i^2 - \frac{\left(\sum_{i=1}^{n} y_i\right)^2}{n} = 92{,}547{,}433.45 - \frac{(42{,}627.15)^2}{20} = 1{,}693{,}737.60$$
From Eq. (2.18) the residual sum of squares is
$$SS_{Res} = SS_T - \hat{\beta}_1 S_{xy} = 1{,}693{,}737.60 - (-37.15)(-41{,}112.65) = 166{,}402.65$$
Therefore, the estimate of $\sigma^2$ is computed from Eq. (2.19) as
$$\hat{\sigma}^2 = \frac{SS_{Res}}{n - 2} = \frac{166{,}402.65}{18} = 9244.59$$
Remember that this estimate of $\sigma^2$ is model dependent. Note that this differs slightly from the value given in the Minitab output (Table 2.3) because of rounding.
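As a quick arithmetic check (again a sketch, not code from the text), the same values follow directly from the summary quantities quoted above; only $SS_T$, $\hat{\beta}_1$, $S_{xy}$, and $n$ are needed, and the variable names here are mine.

```python
# Reproduce the hand calculation from the summary statistics quoted
# in Examples 2.1-2.2 (no raw data needed).
SS_T = 1_693_737.60      # corrected sum of squares of the response
b1 = -37.15              # slope estimate from Example 2.1
S_xy = -41_112.65        # corrected cross-product from Example 2.1

SS_Res = SS_T - b1 * S_xy    # Eq. (2.18)
MS_Res = SS_Res / (20 - 2)   # Eq. (2.19) with n = 20
print(SS_Res, MS_Res)        # approximately 166402.65 and 9244.59
```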
2.2.4 Alternate Form of the Model
There is an alternate form of the simple linear regression model that is occasionally useful. Suppose that we redefine the regressor variable $x_i$ as the deviation from its own average, say $x_i - \bar{x}$. The regression model then becomes
$$y_i = \beta_0 + \beta_1(x_i - \bar{x}) + \beta_1\bar{x} + \varepsilon_i = (\beta_0 + \beta_1\bar{x}) + \beta_1(x_i - \bar{x}) + \varepsilon_i = \beta_0' + \beta_1(x_i - \bar{x}) + \varepsilon_i \qquad (2.20)$$
Note that redefining the regressor variable in Eq. (2.20) has shifted the origin of the $x$'s from zero to $\bar{x}$. In order to keep the fitted values the same in both the original and transformed models, it is necessary to modify the original intercept. The relationship between the original and transformed intercepts is
$$\beta_0' = \beta_0 + \beta_1\bar{x} \qquad (2.21)$$
It is easy to show that the least-squares estimator of the transformed intercept is $\hat{\beta}_0' = \bar{y}$, while the estimator of the slope is unaffected by the transformation. In terms of the transformed model, the fitted equation is
$$\hat{y} = \bar{y} + \hat{\beta}_1(x - \bar{x}) \qquad (2.22)$$
Although Eqs. (2.22) and (2.8) are equivalent (they both produce the same value of $\hat{y}$ for the same value of $x$), Eq. (2.22) directly reminds the analyst that the regression model is only valid over the range of the $x$'s in the original data. This region is centered at $\bar{x}$.
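The equivalence of the two parameterizations is easy to verify numerically. The sketch below fits both forms to a small set of hypothetical data (the data and all names are mine, not the rocket propellant values) and confirms that the fitted values agree.

```python
# Sketch: fit the model in its original form and in the centered form
# of Eq. (2.20), then check that both give identical fitted values.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 25, size=20)                  # hypothetical regressor values
y = 2600.0 - 37.0 * x + rng.normal(0, 95, 20)    # hypothetical responses

S_xy = np.sum(y * (x - x.mean()))
S_xx = np.sum((x - x.mean()) ** 2)
b1 = S_xy / S_xx                 # slope: identical in both parameterizations
b0 = y.mean() - b1 * x.mean()    # intercept of the original fit
b0_prime = y.mean()              # transformed intercept: beta0'-hat = y-bar

yhat_original = b0 + b1 * x                       # original fitted line
yhat_centered = b0_prime + b1 * (x - x.mean())    # centered form, Eq. (2.22)
print(np.allclose(yhat_original, yhat_centered))  # True
```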
2.3 HYPOTHESIS TESTING ON THE SLOPE AND INTERCEPT
We are often interested in testing hypotheses and constructing confidence intervals about the model parameters. Hypothesis testing is discussed in this section, and Section 2.4 deals with confidence intervals. These procedures require that we make the additional assumption that the model errors $\varepsilon_i$ are normally distributed. Thus, the complete assumptions are that the errors are normally and independently distributed with mean 0 and variance $\sigma^2$, abbreviated NID(0, $\sigma^2$). In Chapter 4 we discuss how these assumptions can be checked through residual analysis.
2.3.1 Use of t Tests
Suppose that we wish to test the hypothesis that the slope equals a constant, say $\beta_{10}$. The appropriate hypotheses are