The Art of Mathematics in Business. Dr Jae K Shim.

Author: Dr Jae K Shim
Publisher: Ingram
Genre: Economics
ISBN: 9781908287113
key to success in using this method.

      The method is simple and effective, since it requires little data beyond the variable involved. One disadvantage of the method, however, is that it does not include industrial or economic factors such as market conditions, prices, or the effects of competitors' actions.

      Introduction

      Regression analysis is a statistical procedure for estimating mathematically the average relationship between the dependent variable and the independent variable(s). The least-squares method is widely used in regression analysis for estimating the parameter values in a regression equation. Simple regression involves one independent variable, such as price or advertising in a demand function, whereas multiple regression involves two or more independent variables, such as price and advertising together.

      How is it computed?

      We will assume a simple (linear) regression to illustrate the least-squares method, which means that we will assume the Y = a + bX relationship, where a = intercept and b = slope. The regression method includes all the observed data and attempts to find a line of best fit. To find this line, a technique called the least-squares method is used.

      The Least-Squares Method

      To explain the least-squares method, we define the error as the difference between the observed value and the estimated one and denote it with u. Symbolically,

      u = Y - Y′

      where Y = observed value of the dependent variable

      Y′ = estimated value based on Y′ = a + bX

      The least-squares criterion requires that the line of best fit be such that the sum of the squares of the errors (or the vertical distance in Figure 19.1 from the observed data points to the line) is a minimum, i.e.,

      Minimize: Σu² = Σ(Y − Y′)² = Σ(Y − a − bX)²

      Using differential calculus we obtain the following equations, called normal equations:

      ΣY = na + bΣX

      ΣXY = aΣX + bΣX²

      Solving the equations for b and a yields

      b = (nΣXY − (ΣX)(ΣY)) / (nΣX² − (ΣX)²)

      a = Ȳ − bX̄
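As a sketch, the solved normal equations translate directly into code. The function below is illustrative (the name `least_squares` and its argument names are mine); the sums passed in are the Table 19.1 column totals quoted in Example 1 below:

```python
def least_squares(n, sum_x, sum_y, sum_xy, sum_x2):
    """Solve the normal equations for the slope b and intercept a."""
    b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    a = sum_y / n - b * (sum_x / n)  # a = Ybar - b * Xbar
    return a, b

# Column sums from Table 19.1 (see Example 1):
a, b = least_squares(n=12, sum_x=174, sum_y=225, sum_xy=3414, sum_x2=2792)
print(round(a, 4), round(b, 4))  # 10.5836 0.5632
```

The result matches the hand computation in Example 1, which is a useful check that the normal equations were solved correctly.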

      Example 1

      To illustrate the computation of b and a, we refer to the data in Table 19.1, where all the required sums are also computed and shown.

[Table 19.1: Data and the column sums required for the least-squares computations]

      From the table:

      ΣX = 174; ΣY = 225; ΣXY = 3,414; ΣX² = 2,792.

      X̄ = ΣX/n = 174/12 = 14.5; Ȳ = ΣY/n = 225/12 = 18.75.

      Substituting these values into the formula for b first:

      b = (12(3,414) − (174)(225)) / (12(2,792) − (174)²) = 1,818/3,228 = 0.5632

      a = Ȳ − bX̄ = 18.75 − (0.5632)(14.5) = 18.75 − 8.1664 = 10.5836

      Thus,

      Y′ = 10.5836 + 0.5632 X

      Can a computer help?

      Spreadsheet programs such as Excel include a regression routine that can be used without difficulty. In practice, you do not compute the parameter values a and b manually. Table 19.2 shows an Excel regression output that contains the statistics discussed so far. The other statistics that appear are discussed in Sec. 20, Regression Statistics.

[Table 19.2: Excel regression output]

      (1) R-squared (r²) = 0.608373 = 60.84%

      (2) Standard error of the estimate (Se) = 2.343622

      (3) Standard error of the coefficient (Sb) = 0.142893

      (4) t-value = 3.94

      Note that all of the above are the same as the ones manually obtained.

      How is it used and applied?

      Before attempting a least-squares regression approach, it is extremely important to plot the observed data on a diagram, called a scattergraph (see Figure 19.3). The reason is that you want to make sure a linear (straight-line) relationship existed between Y and X in the past sample. If a nonlinear relationship is detected in the sample, the assumed linear relationship -- Y = a + bX -- will not give us a good fit.

      Example 2

      Assume that advertising of $10 is to be expended next year; the projected sales for next year would be computed as follows:

      Y′ = 10.5836 + 0.5632 X

      = 10.5836 + 0.5632 (10)

      = $ 16.2156
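The forecast above is simply the fitted line evaluated at X = 10. A minimal sketch, using the estimates from Example 1 (the function name `forecast` is mine):

```python
a, b = 10.5836, 0.5632  # least-squares estimates from Example 1

def forecast(x):
    """Projected sales from the fitted line Y' = a + bX."""
    return a + b * x

print(round(forecast(10), 4))  # 16.2156
```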

      In order to obtain a good fit and achieve a high degree of accuracy, you should be familiar with statistics relating to regression such as r-squared (R2) and t-value, which are discussed later.

[Figure 19.3: Scattergraph of the observed data]

      Introduction

      Regression analysis is a statistical procedure for estimating mathematically the average relationship between the dependent variable (e.g., sales) and the independent variable(s) (e.g., price, advertising, or both). It uses a variety of statistics to convey the accuracy and reliability of the regression results.

      How is it computed?

      Regression statistics include:

      1. Correlation coefficient (r) and coefficient of determination (r²)

      2. Standard error of the estimate (Se)

      3. Standard error of the regression coefficient (Sb) and t statistics

      1. Correlation Coefficient (r) and Coefficient of Determination (r²)

      The correlation coefficient r measures the degree of correlation between Y and X. It takes on values between −1 and +1. More widely used, however, is the coefficient of determination, designated r² (read as r-squared). Simply put, r² indicates the quality of the estimated regression equation--a measure of "goodness of fit" in the regression. Therefore, the higher the r², the more confidence can be placed in the estimated equation.
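To make the relationship between r and r² concrete, here is a minimal sketch that computes both from their deviation formulas; the data are made up for illustration and are not the chapter's Table 19.1 values:

```python
import math

# Hypothetical observations, for illustration only
x = [1, 2, 3, 4]
y = [2, 4, 5, 8]
n = len(x)

xbar, ybar = sum(x) / n, sum(y) / n
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
sxx = sum((xi - xbar) ** 2 for xi in x)
syy = sum((yi - ybar) ** 2 for yi in y)

r = sxy / math.sqrt(sxx * syy)  # correlation coefficient, -1 <= r <= 1
r2 = r ** 2                     # coefficient of determination
print(round(r2, 4))             # 0.9627
```

Squaring r always yields a value between 0 and 1, which is why r² is read directly as the proportion of fit.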

      More specifically, the coefficient of determination represents the proportion of the total variation in Y that is explained