Introduction to Linear Regression Analysis. Douglas C. Montgomery.

      (2.17)   SS_{Res} = \sum_{i=1}^{n} y_i^2 - n\bar{y}^2 - \hat{\beta}_1 S_{xy}

      But

      \sum_{i=1}^{n} y_i^2 - n\bar{y}^2 = SS_T

      is just the corrected sum of squares of the response observations, so

      (2.18)   SS_{Res} = SS_T - \hat{\beta}_1 S_{xy}

      The residual sum of squares has n − 2 degrees of freedom, because two degrees of freedom are associated with the estimates β̂0 and β̂1 involved in obtaining ŷi. Section C.3 shows that the expected value of SS_Res is E(SS_Res) = (n − 2)σ², so an unbiased estimator of σ² is

      (2.19)   \hat{\sigma}^2 = SS_{Res} / (n - 2) = MS_{Res}

      The quantity MS_Res is called the residual mean square. The square root of σ̂² is sometimes called the standard error of regression, and it has the same units as the response variable y.

      Because σ̂² depends on the residual sum of squares, any violation of the assumptions on the model errors or any misspecification of the model form may seriously damage the usefulness of σ̂² as an estimate of σ². Because σ̂² is computed from the regression model residuals, we say that it is a model-dependent estimate of σ².
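
      The computation of σ̂² is easy to carry out directly. The following is a minimal sketch (not from the book), assuming the observations are available as NumPy arrays x and y; all names are illustrative.

# Minimal sketch: computing MS_Res via Eqs. (2.17)-(2.19) for simple linear regression.
import numpy as np

def residual_mean_square(x, y):
    """Return (beta0_hat, beta1_hat, SS_Res, MS_Res) for simple linear regression."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(y)

    S_xy = np.sum(x * y) - np.sum(x) * np.sum(y) / n   # corrected cross-product sum
    S_xx = np.sum(x ** 2) - np.sum(x) ** 2 / n         # corrected sum of squares of x
    beta1 = S_xy / S_xx                                # least-squares slope
    beta0 = y.mean() - beta1 * x.mean()                # least-squares intercept

    SS_T = np.sum(y ** 2) - n * y.mean() ** 2          # corrected sum of squares of y
    SS_Res = SS_T - beta1 * S_xy                       # Eq. (2.18)
    MS_Res = SS_Res / (n - 2)                          # Eq. (2.19), unbiased for sigma^2
    return beta0, beta1, SS_Res, MS_Res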

      Example 2.2 The Rocket Propellant Data

      To estimate σ² for the rocket propellant data of Example 2.1, first find the corrected sum of squares of the response,

      SS_T = \sum y_i^2 - (\sum y_i)^2 / n = 92,547,433.45 - (42,627.15)^2 / 20 = 1,693,737.60

      From Eq. (2.18) the residual sum of squares is

      SS_{Res} = SS_T - \hat{\beta}_1 S_{xy} = 1,693,737.60 - (-37.15)(-41,112.65) = 166,402.65

      Therefore, from Eq. (2.19) the estimate of σ² is

      \hat{\sigma}^2 = MS_{Res} = 166,402.65 / 18 = 9244.59
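
      As a quick check of this arithmetic, the same numbers can be reproduced from the summary quantities quoted above (a sketch; the raw data of Example 2.1 are not repeated here).

# Reproduce the Example 2.2 arithmetic from the summary quantities quoted above.
SS_T  = 1_693_737.60
beta1 = -37.15           # least-squares slope from Example 2.1
S_xy  = -41_112.65       # corrected cross-product sum from Example 2.1
n     = 20

SS_Res = SS_T - beta1 * S_xy   # Eq. (2.18): 166,402.65
MS_Res = SS_Res / (n - 2)      # Eq. (2.19): about 9244.59
print(round(SS_Res, 2), round(MS_Res, 2))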

      There is an alternate form of the simple linear regression model that is occasionally useful. Suppose that we redefine the regressor variable xi as the deviation from its own average, say xi − x̄. The regression model then becomes

      (2.21)   y_i = (\beta_0 + \beta_1\bar{x}) + \beta_1(x_i - \bar{x}) + \varepsilon_i = \beta_0' + \beta_1(x_i - \bar{x}) + \varepsilon_i

      It is easy to show that the least-squares estimator of the transformed intercept is β̂0′ = ȳ. The estimator of the slope is unaffected by the transformation. This alternate form of the model has some advantages. First, the least-squares estimators β̂0′ and β̂1 are uncorrelated, that is, Cov(β̂0′, β̂1) = 0. This will make some applications of the model easier, such as finding confidence intervals on the mean of y (see Section 2.4.2). Finally, the fitted model is

      \hat{y} = \bar{y} + \hat{\beta}_1(x - \bar{x})
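
      A quick numerical illustration of the first two facts (a sketch with made-up data, not the rocket propellant data): centering the regressor leaves the fitted slope unchanged and makes the fitted intercept equal to ȳ.

# Sketch with illustrative data: the centered form of the simple linear regression model.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(5, 25, size=20)                   # illustrative regressor values
y = 2600.0 - 37.0 * x + rng.normal(0, 95, 20)     # illustrative response values

# Fit once with the original regressor and once with the centered regressor x - x_bar.
b1_orig, b0_orig = np.polyfit(x, y, 1)
b1_cent, b0_cent = np.polyfit(x - x.mean(), y, 1)

print(np.isclose(b1_orig, b1_cent))   # True: the slope estimate is unaffected
print(np.isclose(b0_cent, y.mean()))  # True: the transformed intercept estimate is y-bar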

      2.3 Hypothesis Testing on the Slope and Intercept

      We are often interested in testing hypotheses and constructing confidence intervals about the model parameters. Hypothesis testing is discussed in this section, and Section 2.4 deals with confidence intervals. These procedures require that we make the additional assumption that the model errors εi are normally distributed. Thus, the complete assumptions are that the errors are normally and independently distributed with mean 0 and variance σ², abbreviated NID(0, σ²). In Chapter 4 we discuss how these assumptions can be checked through residual analysis.

      2.3.1 Use of t Tests

      Suppose that we wish to test the hypothesis that the slope equals a constant, say β10. The appropriate hypotheses are

      H_0: \beta_1 = \beta_{10},   H_1: \beta_1 \ne \beta_{10}

      where a two-sided alternative is specified.
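
      The statistic used for these hypotheses is the familiar t ratio, t0 = (β̂1 − β10)/√(MS_Res/S_xx), which under H0 has a t distribution with n − 2 degrees of freedom. The following is a minimal sketch (not the book's code), assuming x and y are arrays of regressor and response values and using scipy.stats for the reference distribution.

# Sketch of the standard t test of H0: beta1 = beta10 against a two-sided alternative.
import numpy as np
from scipy import stats

def t_test_slope(x, y, beta10=0.0, alpha=0.05):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(y)
    S_xx = np.sum((x - x.mean()) ** 2)
    S_xy = np.sum((x - x.mean()) * (y - y.mean()))
    beta1 = S_xy / S_xx                              # least-squares slope
    beta0 = y.mean() - beta1 * x.mean()              # least-squares intercept
    MS_Res = np.sum((y - (beta0 + beta1 * x)) ** 2) / (n - 2)

    se_beta1 = np.sqrt(MS_Res / S_xx)                # estimated standard error of the slope
    t0 = (beta1 - beta10) / se_beta1                 # t statistic, n - 2 df under H0
    t_crit = stats.t.ppf(1 - alpha / 2, n - 2)       # two-sided critical value
    p_value = 2 * stats.t.sf(abs(t0), n - 2)
    return t0, t_crit, p_value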