2.24 Repeat Problem 2.23 using only 10 observations in each sample, drawing one observation from each level x = 1, 2, 3, …, 10. What impact does using n = 10 have on the questions asked in Problem 2.23? Compare the lengths of the CIs and the appearance of the histograms.
2.25 Consider the simple linear regression model y = β₀ + β₁x + ε, with E(ε) = 0, Var(ε) = σ², and ε uncorrelated.
a. Show that Cov(β̂₀, β̂₁) = −x̄σ²/Sxx.
b. Show that Cov(ȳ, β̂₁) = 0.
2.26 Consider the simple linear regression model y = β₀ + β₁x + ε, with E(ε) = 0, Var(ε) = σ², and ε uncorrelated.
a. Show that E(MSR) = σ² + β₁²Sxx.
b. Show that E(MSRes) = σ².
2.27 Suppose that we have fit the straight-line regression model ŷ = β̂₀ + β̂₁x₁, but the response is also affected by a second variable x₂ such that the true regression function is E(y) = β₀ + β₁x₁ + β₂x₂.
a. Is the least-squares estimator of the slope in the original simple linear regression model unbiased?
b. Show the bias in β̂₁.
2.28 Consider the maximum-likelihood estimator σ̂² of σ² in the simple linear regression model. We know that σ̂² is a biased estimator for σ².
a. Show the amount of bias in σ̂².
b. What happens to the bias as the sample size n becomes large?
2.29 Suppose that we are fitting a straight line and wish to make the standard error of the slope as small as possible. Suppose that the "region of interest" for x is −1 ≤ x ≤ 1. Where should the observations x₁, x₂, …, xₙ be taken? Discuss the practical aspects of this data collection plan.
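As a starting point, recall the slope-variance formula from the chapter (this merely restates the quantity to be controlled; it is not a complete answer to the design question):

\[
\operatorname{Var}(\hat{\beta}_1) = \frac{\sigma^2}{S_{xx}}, \qquad S_{xx} = \sum_{i=1}^{n} (x_i - \bar{x})^2,
\]

so making the standard error of the slope small amounts to choosing the xᵢ within [−1, 1] so that Sxx is as large as possible.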
2.30 Consider the data in Problem 2.12 and assume that steam usage and average temperature are jointly normally distributed.
a. Find the correlation between steam usage and monthly average ambient temperature.
b. Test the hypothesis that ρ = 0.
c. Test the hypothesis that ρ = 0.5.
d. Find a 99% CI for ρ.
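A minimal computational sketch for parts a–d, assuming the Problem 2.12 values have already been loaded into NumPy arrays named temp and usage (those names, and the use of SciPy, are illustrative assumptions, not part of the problem):

import numpy as np
from scipy import stats

# temp, usage: arrays holding the Problem 2.12 average temperatures and steam-usage
# values (assumed to have been loaded already; the names are illustrative).
n = len(usage)

# (a) sample correlation and (b) t-test of H0: rho = 0
r, p_value = stats.pearsonr(temp, usage)

# (c) test H0: rho = 0.5 and (d) 99% CI for rho, both via Fisher's z-transform
z_r = np.arctanh(r)                     # Fisher z of the sample correlation
z_0 = np.arctanh(0.5)                   # Fisher z of the hypothesized value
se = 1.0 / np.sqrt(n - 3)               # large-sample standard error of z_r
z_stat = (z_r - z_0) / se               # compare with standard normal quantiles
z_crit = stats.norm.ppf(0.995)          # two-sided 99% critical value
ci_rho = np.tanh([z_r - z_crit * se, z_r + z_crit * se])  # back-transform to the rho scale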
2.31 Prove that the maximum value of R² is less than 1 if the data contain repeated (different) observations on y at the same value of x.
2.32 Consider the simple linear regression model y = β₀ + β₁x + ε, where the intercept β₀ is known.
a. Find the least-squares estimator of β₁ for this model. Does this answer seem reasonable?
b. What is the variance of the least-squares estimator of the slope found in part a?
c. Find a 100(1 − α) percent CI for β₁. Is this interval narrower than the corresponding interval for the case where both slope and intercept are unknown?
2.33 Consider the least-squares residuals eᵢ = yᵢ − ŷᵢ, i = 1, 2, …, n, from the simple linear regression model. Find the variance of the residuals, Var(eᵢ). Is the variance of the residuals a constant? Discuss.
2.34 Consider the baseball regression model from Section 2.8 and assume that wins and ERA are jointly normally distributed.
a. Find the correlation between wins and team ERA.
b. Test the hypothesis that ρ = 0.
c. Test the hypothesis that ρ = 0.5.
d. Find a 95% CI for ρ.
2.35 Consider the baseball data in Table B.22. Fit a regression model to team wins using total runs scored as the predictor. How does that model compare to the one developed in Section 2.8 using team ERA as the predictor?
2.36 Table B.24 contains the median family home rental price and other data for 51 US cities. Fit a linear regression model using the median home rental price as the response variable and the median price per square foot as the predictor variable.
a. Test for significance of regression.
b. Find a 95% CI on the slope in this model.
c. Does this predictor do an adequate job of explaining the variability in home rental prices?
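A minimal fitting sketch, assuming the relevant Table B.24 columns have been loaded into arrays named rent and price_per_sqft (those names, and the use of statsmodels, are illustrative assumptions, not part of the problem):

import statsmodels.api as sm

# rent, price_per_sqft: arrays holding the Table B.24 rental prices and prices per
# square foot (assumed to have been loaded already; the names are illustrative).
X = sm.add_constant(price_per_sqft)     # design matrix with an intercept column
fit = sm.OLS(rent, X).fit()             # least-squares fit of rent on price per square foot

print(fit.summary())                    # (a) t- and F-statistics for significance of regression
print(fit.conf_int(alpha=0.05))         # (b) 95% confidence intervals, including the slope
print(fit.rsquared)                     # (c) R-squared as a rough measure of explained variability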
2.37 Consider the rental price data in Table B.24. Assume that median home rental price and median price per square foot are jointly normally distributed.
a. Find the correlation between home rental price and home price per square foot.
b. Test the hypothesis that ρ = 0.
c. Test the hypothesis that ρ = 0.5.
d. Find a 95% CI for ρ.
2.38 You have fit a linear regression model to a sample of 20 observations. The total sum of squares is 100 and the regression sum of squares is 80. The estimate of the error variance is
a. 1.5
b. 1.2
c. 2.0
d. 1.88
e. None of the above.
2.39 You have fit a simple linear regression model to a sample of 25 observations. The value of the t-statistic for testing that the slope is zero is 2.75. An upper bound on the P-value for this test is
a. 0.05
b. 0.025
c. 0.01
d. None of the above.
2.40 A linear regression model with an intercept term will always pass through the centroid of the data.
a. True
b. False
2.41 The variance of the predicted response in a linear regression model is a minimum at the average value of the predictor variable.
a. True
b. False
2.42 The confidence interval on the mean response at a particular value of the predictor variable is always wider than the prediction interval on a new observation at the same point.
a. True
b. False
2.43 The method of least squares ensures that the estimators of the slope and intercept in a linear regression model are best linear unbiased estimators.
a. True
b. False
2.44 For any simple linear regression model that has an intercept, the sum of the residuals is always zero.
a. True
b. False