Chap14_Introdution to Multiple Regression_ans (1)

Chapter 14 of 'Statistics for Business and Economics' focuses on multiple regression analysis, enabling readers to apply this technique in business decision-making and interpret computer output. It covers the formulation of the multiple regression model, assumptions, estimation of coefficients, and evaluation of individual regression coefficients through hypothesis testing. An example involving pie sales illustrates the application of multiple regression with independent variables such as price and advertising.


Statistics for Business and Economics
6th Edition

Chapter 14

Multiple Regression

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.


Chapter Goals
After completing this chapter, you should be able to:
• Apply multiple regression analysis to business decision-making situations
• Analyze and interpret the computer output for a multiple regression model
• Perform a hypothesis test for all regression coefficients or for a subset of coefficients
• Fit and interpret nonlinear regression models
• Incorporate qualitative variables into the regression model by using dummy variables



The Multiple Regression Model

Idea: Examine the linear relationship between 1 dependent variable (Y) and 2 or more independent variables (Xi)

Multiple regression model with k independent variables:

Y = β0 + β1X1 + β2X2 + … + βkXk + ε

where β0 is the Y-intercept, β1, …, βk are the population slopes, and ε is the random error


Multiple Regression Equation

The coefficients of the multiple regression model are estimated using sample data.

Multiple regression equation with k independent variables:

ŷi = b0 + b1x1i + b2x2i + … + bkxki

where ŷi is the estimated (or predicted) value of y, b0 is the estimated intercept, and b1, …, bk are the estimated slope coefficients.

In this chapter we will always use a computer to obtain the regression slope coefficients and other regression summary measures.
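As a sketch of what the computer does here, NumPy's least-squares solver can produce the bj from a design matrix and a response vector. The data below are made up for illustration; only the procedure (intercept column plus least squares) is the point.

```python
import numpy as np

# Hypothetical sample data: two independent variables and a response
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y  = np.array([3.1, 4.9, 8.0, 9.1, 12.2])

# Design matrix with a leading column of 1s for the intercept b0
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares estimates b = (b0, b1, b2)
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Fitted values  ŷi = b0 + b1*x1i + b2*x2i
yhat = X @ b
```

The same pattern scales to any number of independent variables: one column of 1s, then one column per x variable.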
Multiple Regression Equation (continued)

Two-variable model:

ŷ = b0 + b1x1 + b2x2

[Figure: the fitted regression plane in (x1, x2, y) space, with the slope for variable x1 and the slope for variable x2 shown along the two axes]
Standard Multiple Regression Assumptions

• The values xi and the error terms εi are independent
• The error terms are random variables with mean 0 and a constant variance, σ²:

E[εi] = 0 and E[εi²] = σ² for i = 1, …, n

(The constant-variance property is called homoscedasticity)


Standard Multiple Regression Assumptions (continued)

• The random error terms, εi, are not correlated with one another, so that

E[εiεj] = 0 for all i ≠ j

• It is not possible to find a set of numbers c0, c1, …, cK, not all zero, such that

c0 + c1x1i + c2x2i + … + cKxKi = 0

(This is the property of no linear relation among the Xj's)
Example: 2 Independent Variables

• A distributor of frozen dessert pies wants to evaluate factors thought to influence demand
  • Dependent variable: Pie sales (units per week)
  • Independent variables: Price (in $), Advertising ($100s)
• Data are collected for 15 weeks


Pie Sales Example

Week   Pie Sales   Price ($)   Advertising ($100s)
 1     350         5.50        3.3
 2     460         7.50        3.3
 3     350         8.00        3.0
 4     430         8.00        4.5
 5     350         6.80        3.0
 6     380         7.50        4.0
 7     430         4.50        3.0
 8     470         6.40        3.7
 9     450         7.00        3.5
10     490         5.00        4.0
11     340         7.20        3.5
12     300         7.90        3.2
13     440         5.90        4.0
14     450         5.00        3.5
15     300         7.00        2.7

Multiple regression equation:
Sales = b0 + b1(Price) + b2(Advertising)
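As a cross-check on the Excel output presented later, the 15 weeks of data above can be run through an ordinary least-squares fit directly; a minimal sketch using NumPy, assuming the table is transcribed correctly:

```python
import numpy as np

# Pie sales data from the table: (sales, price, advertising in $100s), weeks 1-15
data = [
    (350, 5.50, 3.3), (460, 7.50, 3.3), (350, 8.00, 3.0),
    (430, 8.00, 4.5), (350, 6.80, 3.0), (380, 7.50, 4.0),
    (430, 4.50, 3.0), (470, 6.40, 3.7), (450, 7.00, 3.5),
    (490, 5.00, 4.0), (340, 7.20, 3.5), (300, 7.90, 3.2),
    (440, 5.90, 4.0), (450, 5.00, 3.5), (300, 7.00, 2.7),
]
y = np.array([row[0] for row in data], dtype=float)
X = np.array([[1.0, row[1], row[2]] for row in data])  # intercept, price, advertising

# Least-squares coefficients: should match the Excel output
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]
print(round(b0, 3), round(b1, 3), round(b2, 3))  # ≈ 306.526, -24.975, 74.131
```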
Estimating a Multiple Linear Regression Equation

• Excel will be used to generate the coefficients and measures of goodness of fit for multiple regression
  • Excel: Tools / Data Analysis... / Regression
  • PHStat: PHStat / Regression / Multiple Regression…


Multiple Regression Output

Regression Statistics
  Multiple R         0.72213
  R Square           0.52148
  Adjusted R Square  0.44172
  Standard Error     47.46341
  Observations       15

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

ANOVA        df   SS          MS          F         Significance F
Regression    2   29460.027   14730.013   6.53861   0.01201
Residual     12   27033.306   2252.776
Total        14   56493.333

              Coefficients   Standard Error   t Stat     P-value   Lower 95%   Upper 95%
Intercept     306.52619      114.25389        2.68285    0.01993   57.58835    555.46404
Price         -24.97509      10.83213         -2.30565   0.03979   -48.57626   -1.37392
Advertising   74.13096       25.96732         2.85478    0.01449   17.55303    130.70888


The Multiple Regression Equation

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

where
  Sales is in number of pies per week
  Price is in $
  Advertising is in $100s

b1 = -24.975: sales will decrease, on average, by 24.975 pies per week for each $1 increase in selling price, net of the effects of changes due to advertising

b2 = 74.131: sales will increase, on average, by 74.131 pies per week for each $100 increase in advertising, net of the effects of changes due to price
Coefficient of Determination, R²

• Reports the proportion of total variation in y explained by all x variables taken together:

R² = SSR / SST = regression sum of squares / total sum of squares

• This is the ratio of the explained variability to total sample variability
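The ratio is direct arithmetic once the sums of squares are in hand; a one-line check using the values from the pie-sales ANOVA output:

```python
# Sums of squares from the pie-sales ANOVA output
SSR = 29460.027   # regression sum of squares
SST = 56493.333   # total sum of squares

r_squared = SSR / SST
print(round(r_squared, 5))  # ≈ 0.52148, matching the "R Square" entry
```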



Coefficient of Determination, R² (continued)

R² = SSR / SST = 29460.0 / 56493.3 = .52148

52.1% of the variation in pie sales is explained by the variation in price and advertising.


Estimation of Error Variance

• Consider the population regression model

yi = β0 + β1x1i + β2x2i + … + βKxKi + εi

• The unbiased estimate of the variance of the errors is

s²e = SSE / (n - K - 1) = Σ e²i / (n - K - 1)

where ei = yi - ŷi

• The square root of the variance, se, is called the standard error of the estimate
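For the pie-sales model, both quantities follow directly from the SSE in the ANOVA output; a quick check:

```python
import math

SSE = 27033.306   # residual (error) sum of squares from the ANOVA output
n, K = 15, 2      # 15 weeks of data, 2 independent variables

s2e = SSE / (n - K - 1)   # estimated error variance (this is the MSE)
se = math.sqrt(s2e)       # standard error of the estimate, ≈ 47.463
print(round(se, 3))
```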
Standard Error, se

se = 47.463

The magnitude of this value can be compared to the average y value.


Adjusted Coefficient of Determination, adjusted R²

• R² never decreases when a new X variable is added to the model, even if the new variable is not an important predictor variable
• This can be a disadvantage when comparing models
• What is the net effect of adding a new variable?
  • We lose a degree of freedom when a new X variable is added
  • Did the new X variable add enough explanatory power to offset the loss of one degree of freedom?
Adjusted Coefficient of Determination, adjusted R² (continued)

• Used to correct for the fact that adding non-relevant independent variables will still reduce the error sum of squares:

adjusted R² = 1 - [SSE / (n - K - 1)] / [SST / (n - 1)]

(where n = sample size, K = number of independent variables)

• Adjusted R² provides a better comparison between multiple regression models with different numbers of independent variables
  • Penalizes excessive use of unimportant independent variables
  • Smaller than R²
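The formula can be checked numerically against the pie-sales output, using SSE and SST from the ANOVA table:

```python
# Values from the pie-sales ANOVA output
SSE, SST = 27033.306, 56493.333
n, K = 15, 2

adj_r2 = 1 - (SSE / (n - K - 1)) / (SST / (n - 1))
print(round(adj_r2, 5))  # ≈ 0.44172, matching "Adjusted R Square"
```

Note the penalty at work: R² here is .52148, while the adjusted value drops to .44172 because two degrees of freedom are spent on the slope coefficients.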
Adjusted R² (continued)

adjusted R² = .44172

44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the sample size and number of independent variables.


Coefficient of Multiple Correlation

• The coefficient of multiple correlation is the correlation between the predicted value and the observed value of the dependent variable:

R = r(ŷ, y) = √R²

• It is the square root of the coefficient of multiple determination
• Used as another measure of the strength of the linear relationship between the dependent variable and the independent variables
• Comparable to the correlation between Y and X in simple regression
Evaluating Individual Regression Coefficients

• Use t tests for individual coefficients
  • Shows whether a specific independent variable is conditionally important
• Hypotheses:
  • H0: βj = 0 (no linear relationship)
  • H1: βj ≠ 0 (linear relationship does exist between xj and y)


Evaluating Individual Regression Coefficients (continued)

H0: βj = 0 (no linear relationship)
H1: βj ≠ 0 (linear relationship does exist between xj and y)

Test statistic:

t = (bj - 0) / S_bj    (df = n - k - 1)
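The statistic for, say, the Price coefficient can be reproduced from the output values; a sketch using scipy for the two-sided p-value:

```python
from scipy import stats

bj, sbj = -24.97509, 10.83213   # Price coefficient and its standard error
n, K = 15, 2
df = n - K - 1                   # 12 degrees of freedom

t = (bj - 0) / sbj               # ≈ -2.30565, matching the "t Stat" column
p = 2 * stats.t.sf(abs(t), df)   # two-sided p-value, ≈ 0.0398
print(round(t, 5), round(p, 4))
```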
Evaluating Individual Regression Coefficients (continued)

From the regression output:
• t-value for Price is t = -2.306, with p-value .0398
• t-value for Advertising is t = 2.855, with p-value .0145


Example: Evaluating Individual Regression Coefficients

H0: βj = 0
H1: βj ≠ 0

From Excel output:
              Coefficients   Standard Error   t Stat     P-value
Price         -24.97509      10.83213         -2.30565   0.03979
Advertising   74.13096       25.96732         2.85478    0.01449

d.f. = 15 - 2 - 1 = 12
α = .05
t12, .025 = 2.1788

[Figure: two-tailed t distribution with rejection regions beyond ±2.1788, each of area α/2 = .025]

The test statistic for each variable falls in the rejection region (p-values < .05).

Decision: Reject H0 for each variable.
Conclusion: There is evidence that both Price and Advertising affect pie sales at α = .05.
Confidence Interval Estimate for the Slope

Confidence interval limits for the population slope βj:

bj ± t(n-K-1, α/2) S_bj    where t has (n - K - 1) d.f.

              Coefficients   Standard Error
Intercept     306.52619      114.25389
Price         -24.97509      10.83213
Advertising   74.13096       25.96732

Here, t has (15 - 2 - 1) = 12 d.f.

Example: Form a 95% confidence interval for the effect of changes in price (x1) on pie sales:

-24.975 ± (2.1788)(10.832)

So the interval is -48.576 < β1 < -1.374
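The same interval can be computed from the coefficient and its standard error; a sketch using scipy for the critical t value:

```python
from scipy import stats

b1, sb1 = -24.97509, 10.83213    # Price slope and its standard error
df = 15 - 2 - 1                  # 12 d.f.

t_crit = stats.t.ppf(0.975, df)  # ≈ 2.1788 for a 95% interval
lower = b1 - t_crit * sb1
upper = b1 + t_crit * sb1
print(round(lower, 3), round(upper, 3))  # ≈ -48.576, -1.374
```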
Confidence Interval Estimate for the Slope (continued)

Confidence interval for the population slope βj:

              Coefficients   Standard Error   …   Lower 95%   Upper 95%
Intercept     306.52619      114.25389        …   57.58835    555.46404
Price         -24.97509      10.83213         …   -48.57626   -1.37392
Advertising   74.13096       25.96732         …   17.55303    130.70888

Example: Excel output also reports these interval endpoints. Weekly sales are estimated to be reduced by between 1.37 and 48.58 pies for each $1 increase in the selling price.


Test on All Coefficients

• F-Test for Overall Significance of the Model
  • Shows whether there is a linear relationship between all of the X variables considered together and Y
  • Use F test statistic
• Hypotheses:
  H0: β1 = β2 = … = βk = 0 (no linear relationship)
  H1: at least one βi ≠ 0 (at least one independent variable affects Y)


F-Test for Overall Significance

• Test statistic:

F = MSR / s²e = [SSR / K] / [SSE / (n - K - 1)]

where F has K (numerator) and (n - K - 1) (denominator) degrees of freedom

• The decision rule is: Reject H0 if F > F(K, n-K-1, α)
F-Test for Overall Significance (continued)

F = MSR / MSE = 14730.0 / 2252.8 = 6.5386

with 2 and 12 degrees of freedom. The p-value for the F-test is the "Significance F" entry in the ANOVA output, 0.01201.


F-Test for Overall Significance (continued)

H0: β1 = β2 = 0
H1: β1 and β2 not both zero
α = .05, df1 = 2, df2 = 12

Test statistic: F = MSR / MSE = 6.5386
Critical value: F.05 = 3.885

[Figure: F distribution with the rejection region of area α = .05 beyond F.05 = 3.885]

Decision: Since the F test statistic is in the rejection region (p-value < .05), reject H0.
Conclusion: There is evidence that at least one independent variable affects Y.
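The whole test can be reproduced from the ANOVA output; a sketch using scipy for the critical value and p-value:

```python
from scipy import stats

MSR, MSE = 14730.013, 2252.776   # mean squares from the ANOVA output
df1, df2 = 2, 12                 # numerator and denominator d.f.

F = MSR / MSE                          # ≈ 6.5386
F_crit = stats.f.ppf(0.95, df1, df2)   # ≈ 3.885
p = stats.f.sf(F, df1, df2)            # ≈ 0.012 (Significance F)
print(round(F, 4), round(F_crit, 3), F > F_crit)
```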
Prediction

• Given a population regression model

yi = β0 + β1x1i + β2x2i + … + βKxKi + εi    (i = 1, 2, …, n)

• then, given a new observation of a data point (x1,n+1, x2,n+1, …, xK,n+1), the best linear unbiased forecast of ŷn+1 is

ŷn+1 = b0 + b1x1,n+1 + b2x2,n+1 + … + bKxK,n+1

• It is risky to forecast for new X values outside the range of the data used to estimate the model coefficients, because we do not have data to support that the linear model extends beyond the observed range.


Using The Equation to Make Predictions

Predict sales for a week in which the selling price is $5.50 and advertising is $350:

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
      = 306.526 - 24.975(5.50) + 74.131(3.5)
      = 428.62

Predicted sales is 428.62 pies. Note that Advertising is in $100s, so $350 means that X2 = 3.5.
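The prediction step is just plugging values into the fitted equation; a small helper (the function name is our own) that also handles the $100s unit conversion:

```python
# Fitted pie-sales equation; predict_sales is an illustrative wrapper
def predict_sales(price, advertising_dollars):
    adv = advertising_dollars / 100.0   # the model uses advertising in $100s
    return 306.526 - 24.975 * price + 74.131 * adv

sales = predict_sales(5.50, 350)
print(round(sales, 2))  # ≈ 428.62
```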
Predictions in PHStat

• PHStat | Regression | Multiple Regression…
• Check the "Confidence and prediction interval estimates" box


Predictions in PHStat (continued)

The output reports:
• the input values
• the predicted y value
• a confidence interval for the mean y value, given these x's
• a prediction interval for an individual y value, given these x's
Residuals in Multiple Regression

Two-variable model:

ŷ = b0 + b1x1 + b2x2

[Figure: a sample observation yi above the fitted plane at (x1i, x2i); the residual is the vertical distance ei = (yi - ŷi)]
Dummy Variables

• A dummy variable is a categorical independent variable with two levels:
  • yes or no, on or off, male or female
  • recorded as 0 or 1
• Regression intercepts are different if the variable is significant
• Assumes equal slopes for the other variables
• If there are more than two levels, the number of dummy variables needed is (number of levels - 1)


Dummy Variable Example

ŷ = b0 + b1x1 + b2x2

Let:
  y = Pie Sales
  x1 = Price
  x2 = Holiday (x2 = 1 if a holiday occurred during the week; x2 = 0 if there was no holiday that week)


Dummy Variable Example (continued)

ŷ = b0 + b1x1 + b2(1) = (b0 + b2) + b1x1    (Holiday)
ŷ = b0 + b1x1 + b2(0) = b0 + b1x1           (No Holiday)

The two lines have different intercepts but the same slope.

[Figure: y (sales) vs. x1 (price); the Holiday line (x2 = 1) has intercept b0 + b2 and the No Holiday line (x2 = 0) has intercept b0]

If H0: β2 = 0 is rejected, then "Holiday" has a significant effect on pie sales.
Interpreting the Dummy Variable Coefficient

Example: Sales = 300 - 30(Price) + 15(Holiday)

  Sales: number of pies sold per week
  Price: pie price in $
  Holiday: 1 if a holiday occurred during the week; 0 if no holiday occurred

b2 = 15: on average, sales were 15 pies greater in weeks with a holiday than in weeks without a holiday, given the same price.


Multiple Regression Assumptions

Errors (residuals) from the regression model:

ei = (yi - ŷi)

Assumptions:
• The errors are normally distributed
• Errors have a constant variance
• The model errors are independent


Chapter Summary

• Developed the multiple regression model
• Tested the significance of the multiple regression model
• Discussed adjusted R²
• Tested individual regression coefficients
• Tested portions of the regression model
• Used dummy variables
• Discussed using residual plots to check model assumptions


9.e) n = 21; K = 2; SSE = 120; SST = 180
     adj R² = 1 - [SSE/(n - K - 1)] / [SST/(n - 1)] = 1 - (120/18)/(180/20) = 1 - 0.741 = 0.259

10.e) n = 13; K = 2; SSE = 120; SST = 150
     adj R² = 1 - [SSE/(n - K - 1)] / [SST/(n - 1)] = 1 - (120/10)/(150/12) = 1 - 0.96 = 0.04

14.11

14.12