regression2
(STAT 367)
COLLEGE OF SCIENCE
E. O. Owiredu
\[
Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \qquad i = 1, \dots, n \tag{1}
\]
Variables:
Y - Dependent Variable
X - Independent Variable
Parameters:
β0 - Intercept
β1 - Slope
ε - Random error component
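As a minimal sketch of this setup in Python (the true parameter values and sample size below are illustrative assumptions, not values from the notes), one sample can be simulated from the model; the objects X, Y and n are reused in the later sketches.

\begin{verbatim}
import numpy as np

# Illustrative assumptions: parameter values chosen only for this demo.
rng = np.random.default_rng(seed=367)
n = 50
beta0_true, beta1_true, sigma = 4.0, 1.0, 2.0

X = rng.uniform(0, 10, size=n)            # independent variable X_i
eps = rng.normal(0, sigma, size=n)        # random error component eps_i
Y = beta0_true + beta1_true * X + eps     # dependent variable Y_i
\end{verbatim}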
\[
L = \min \sum_{i=1}^{n} \hat{\varepsilon}_i^2 = \sum_{i=1}^{n} \left( Y_i - \hat{Y}_i \right)^2 \tag{3}
\]
\[
L = \sum_{i=1}^{n} \hat{\varepsilon}_i^2 = \sum_{i=1}^{n} \left( Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i \right)^2 \tag{4}
\]
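To illustrate what the criterion in (3)-(4) measures, the sketch below (continuing the simulated X and Y above; the candidate coefficient pairs are arbitrary) evaluates L for a few candidate lines; L is smallest near the values used to generate the data.

\begin{verbatim}
def sse(b0, b1, X, Y):
    """Least squares criterion L: sum of squared deviations of Y from b0 + b1*X."""
    resid = Y - (b0 + b1 * X)
    return np.sum(resid ** 2)

# Candidate (beta0, beta1) pairs chosen arbitrarily for illustration.
for b0, b1 in [(0.0, 0.0), (4.0, 1.0), (4.5, 0.9)]:
    print(b0, b1, sse(b0, b1, X, Y))
\end{verbatim}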
\[
\frac{\partial L}{\partial \hat{\beta}_0} = -2 \sum_{i=1}^{n} \left( Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i \right) \tag{5}
\]
\[
\frac{\partial L}{\partial \hat{\beta}_1} = -2 \sum_{i=1}^{n} X_i \left( Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i \right) \tag{6}
\]
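Setting (5) and (6) to zero gives the two normal equations, which form a 2 by 2 linear system in the estimates; a minimal numerical sketch (continuing the simulated data above) solves that system directly.

\begin{verbatim}
# Normal equations in matrix form:
#   [ n        sum(X)   ] [ b0 ]   [ sum(Y)   ]
#   [ sum(X)   sum(X^2) ] [ b1 ] = [ sum(X*Y) ]
A = np.array([[n,        X.sum()],
              [X.sum(), (X ** 2).sum()]])
b = np.array([Y.sum(), (X * Y).sum()])
beta0_hat, beta1_hat = np.linalg.solve(A, b)
print(beta0_hat, beta1_hat)
\end{verbatim}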
These normal equations are solved to obtain the estimates $\hat{\beta}_0$ and $\hat{\beta}_1$.
From eqn 5
\[
\sum_{i=1}^{n} \left( Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i \right) = 0 \tag{7}
\]
\[
\sum_{i=1}^{n} Y_i - n\hat{\beta}_0 - \hat{\beta}_1 \sum_{i=1}^{n} X_i = 0 \tag{8}
\]
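Rearranging equation (8) and dividing through by $n$ gives the intercept estimate that is substituted in equation (14) below:
\[
\hat{\beta}_0 = \frac{\sum_{i=1}^{n} Y_i}{n} - \hat{\beta}_1 \frac{\sum_{i=1}^{n} X_i}{n} = \bar{Y} - \hat{\beta}_1 \bar{X}
\]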
From eqn 10
\[
\sum_{i=1}^{n} X_i \left( Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i \right) = 0 \tag{13}
\]
From eqn 10
\[
\sum_{i=1}^{n} X_i \left( Y_i - [\bar{Y} - \hat{\beta}_1 \bar{X}] - \hat{\beta}_1 X_i \right) = 0 \tag{14}
\]
\[
\sum_{i=1}^{n} X_i \left[ (Y_i - \bar{Y}) + \hat{\beta}_1 (\bar{X} - X_i) \right] = 0 \tag{15}
\]
\[
\sum_{i=1}^{n} (Y_i - \bar{Y}) X_i - \hat{\beta}_1 \sum_{i=1}^{n} (X_i - \bar{X}) X_i = 0 \tag{16}
\]
\[
\sum_{i=1}^{n} (Y_i - \bar{Y}) X_i = \hat{\beta}_1 \sum_{i=1}^{n} (X_i - \bar{X}) X_i \tag{17}
\]
From eqn 16
\[
\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (Y_i - \bar{Y})(X_i - \bar{X})}{\sum_{i=1}^{n} (X_i - \bar{X})(X_i - \bar{X})} \tag{18}
\]
\[
\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (Y_i - \bar{Y})(X_i - \bar{X})}{\sum_{i=1}^{n} (X_i - \bar{X})^2} \tag{19}
\]
\[
\hat{\beta}_1 = \frac{S_{xy}}{S_{xx}} \tag{20}
\]
\[
\hat{\beta}_1 = \frac{\operatorname{cov}(Y, X)}{\operatorname{var}(X)} \tag{21}
\]
\[
\hat{\beta}_1 = \frac{n \sum_{i=1}^{n} X_i Y_i - \left( \sum_{i=1}^{n} X_i \right)\left( \sum_{i=1}^{n} Y_i \right)}{n \sum_{i=1}^{n} X_i^2 - \left( \sum_{i=1}^{n} X_i \right)^2} \tag{22}
\]
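A short numerical check on the simulated data above that the equivalent formulas (19)-(22) give the same slope, and that the intercept follows from $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}$; the variable names are arbitrary.

\begin{verbatim}
Xbar, Ybar = X.mean(), Y.mean()

Sxy = np.sum((Y - Ybar) * (X - Xbar))
Sxx = np.sum((X - Xbar) ** 2)

b1_moments = Sxy / Sxx                                    # eqn (19)/(20)
b1_cov = np.cov(Y, X, ddof=1)[0, 1] / np.var(X, ddof=1)   # eqn (21)
b1_comp = ((n * np.sum(X * Y) - X.sum() * Y.sum())
           / (n * np.sum(X ** 2) - X.sum() ** 2))         # eqn (22)

b0_hat = Ybar - b1_moments * Xbar                         # intercept estimate
print(b1_moments, b1_cov, b1_comp, b0_hat)                # the slopes agree
\end{verbatim}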
\[
\hat{Y} = 4.692 + 0.923X
\]
\[
\hat{Y} = 4.692 + 0.923(8.5)
\]
\[
\hat{Y} = 12.538
\]
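The same point prediction in Python, using the coefficients of the fitted line quoted above (these come from the notes, not from the simulated data):

\begin{verbatim}
b0_fit, b1_fit = 4.692, 0.923        # fitted line from the notes
x_new = 8.5
y_pred = b0_fit + b1_fit * x_new
print(y_pred)                        # about 12.5375 (12.538 after rounding)
\end{verbatim}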
The residual
\[
\hat{\varepsilon}_i = Y_i - \hat{Y}_i
\]
is used to obtain an estimate of the error term. The sum of squares of the residuals (the error sum of squares) is
\[
SS_{Res} = \sum_{i=1}^{n} \hat{\varepsilon}_i^2 = \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2
\]
with expectation
\[
E(SS_{Res}) = (n - 2)\sigma^2 .
\]
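Continuing the simulated-data sketch, the residuals, the residual sum of squares, and the estimate $\hat{\sigma}^2 = SS_{Res}/(n-2)$ suggested by $E(SS_{Res}) = (n-2)\sigma^2$ can be computed as follows.

\begin{verbatim}
Y_hat = b0_hat + b1_moments * X      # fitted values
resid = Y - Y_hat                    # residuals e_i = Y_i - Y_hat_i
SS_res = np.sum(resid ** 2)          # residual (error) sum of squares

sigma2_hat = SS_res / (n - 2)        # unbiased estimate of sigma^2
print(SS_res, sigma2_hat)
\end{verbatim}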
\[
t = \frac{\hat{\beta}_1 - \beta_1}{S_{\hat{\beta}_1}} \tag{25}
\]
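A sketch of the corresponding test of $H_0: \beta_1 = 0$ on the simulated data. The notes do not define $S_{\hat{\beta}_1}$ at this point, so the formula $S_{\hat{\beta}_1} = \sqrt{\hat{\sigma}^2 / S_{xx}}$ used below is the usual standard error under the model assumptions, and SciPy is assumed to be available only for the p-value.

\begin{verbatim}
from scipy import stats

beta1_null = 0.0                                   # hypothesised value of beta1
se_b1 = np.sqrt(sigma2_hat / Sxx)                  # standard error of beta1_hat
t_stat = (b1_moments - beta1_null) / se_b1         # eqn (25)
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)    # two-sided p-value, n-2 df
print(t_stat, p_value)
\end{verbatim}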