
"
"


significant

logistic Regression 161


Independence variables . binary


) (Ho :
Ho=bi=o

. i= 1,2.7 bi

(Significance = p- value)

113

....

= 0.05 :
.1 ) (Var7

) (Sig=0 ) (y

.2 ) (Var 1
) (y Sig =0.002

.3 ) (Var 2
Sig=0.014

.4 ) (Var5
Sig=0.037

) ) (Var4 )(Var6

) ( (Var3 = 0.05


1-1:

1-2:



1-3:
The objective is to identify which of the independent variables have a significant effect on the dependent variable, which is binary.

Logistic regression is used because the dependent variable is binary (Bernoulli), and the parameters of the model are estimated (parameters estimation) and tested.

1-4:


1-5:
... 2009.

1-6:
The sample, comprising 161 cases, was selected using a multistage cluster sample (25 ...).

1-7:
The study variables, all of them binary, are:

y: the dependent variable (= 1 ..., = 0 ...).
Var1: Q1 (= 0 if 15000 or less, = 1 if more than 15000).
Var2: Q2 (= 1 ..., = 0 ...).
Var3: Q3 (= 1 ..., = 0 ...).
Var4: Q4 (= 1 ..., = 0 ...).
Var5: Q5 (= 1 ..., = 0 ...).
Var6: Q6 (= 1 ..., = 0 ...).
Var7: Q7 (= 1 ..., = 0 ...).
(10)
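As an illustration only, the 0/1 coding described above could be reproduced with a short script. This is a sketch, not the authors' procedure; the file name and the raw column names (income, and Q2-Q7 assumed to arrive coded 1/2) are assumptions.

import pandas as pd

# Hypothetical raw questionnaire data; file and column names are assumed.
df = pd.read_csv("survey.csv")

# Var1 (Q1): 0 if the value is 15000 or less, 1 if it is more than 15000.
df["Var1"] = (df["income"] > 15000).astype(int)

# The remaining questions are assumed to arrive coded 1/2 and are recoded to 0/1.
for q in ["Q2", "Q3", "Q4", "Q5", "Q6", "Q7"]:
    df["Var" + q[1]] = (df[q] == 2).astype(int)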

1-8 (Study hypotheses):
For each independent variable the null hypothesis states that it has no significant effect on the dependent variable:

H0: b1 = 0, i.e. (Var1) has no significant effect on the dependent variable.
H0: b2 = 0, i.e. (Var2) has no significant effect on the dependent variable.

and similarly for (Var3), (Var4), (Var5), (Var6) and (Var7).


1-9:

2- (Logistic Regression):
The dependent variable (y) follows a Bernoulli distribution: it takes the value (1) with probability (p) and the value (0) with probability q = (1 - p).

In the ordinary linear regression model:

y = b0 + b1x + e

(y) is the dependent variable, X the independent variable, E(y) the expected value of (y), and e the random error, e = y - ŷ, so that:

E(y / x) = b0 + b1x

When (y) is binary, its expected value is the probability that (y) takes the value (1):

E(y / x) = P(y = 1) = P

which must lie in the interval (0,1). A linear model does not guarantee this, since its right-hand side can take any value, while P is a probability of (y) with 0 ≤ P ≤ 1; the ratio p/q (the odds) is therefore used, where q = 1 - p.


Since 0 ≤ P ≤ 1, the odds P/q takes values from 0 to ∞, and its logarithm loge(P/q) takes values on the whole real line, so the model is written as:

loge(p/q) = b0 + b1x

and, in general, with k independent variables Xij:

loge(p/q) = b0 + Σj bj Xij ,   i = 1, 2, ..., n ;  j = 1, 2, ..., k

Solving for P gives:

P = 1 / (1 + exp[-(b0 + Σj bj Xij)])

where exp denotes the exponential function.
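For illustration, the relation between the linear predictor and P can be written as a small function. This is only a sketch of the formula above; the coefficient values in the example call are made up.

import numpy as np

def logistic_probability(b0, b, x):
    """P = 1 / (1 + exp(-(b0 + sum_j b_j * x_j)))."""
    eta = b0 + np.dot(b, x)              # the linear predictor (the logit)
    return 1.0 / (1.0 + np.exp(-eta))

# Example with made-up numbers: b0 = 0.5 and two regressors.
print(logistic_probability(0.5, np.array([1.2, -0.8]), np.array([1, 0])))  # about 0.85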

The transformation from p to loge(p/q), i.e. the natural logarithm (ln) of the odds, is called the Logit transformation; it maps the probability, which is confined to the interval (0-1), onto the whole real line. The ratio p/q is the odds of success, the odds that (y) takes the value (1), while q/p is the odds of failure, and loge(p/(1-p)) is called the Log Odds Ratio (Logit). The model rests on the logistic distribution, whose values lie in (0-1), and its parameters are estimated by the maximum likelihood method.


The maximum likelihood (M.L) function for a sample of n observations with probabilities (P1, P2, ..., Pn) is:

M.L = prob(P1, P2, ..., Pn)
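For independent Bernoulli observations this likelihood is the product of the individual probabilities, and in practice its logarithm is maximized. A minimal sketch of the quantity being maximized, where y is the 0/1 response vector and p the vector of fitted probabilities:

import numpy as np

def log_likelihood(y, p):
    """log L = sum_i [ y_i*log(p_i) + (1 - y_i)*log(1 - p_i) ]."""
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def minus_two_log_likelihood(y, p):
    # -2 log likelihood, the quantity reported in the SPSS iteration history.
    return -2.0 * log_likelihood(y, p)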

3- Application:
The model was estimated in SPSS with all variables entered in a single step (the Enter method):

Analyze → Regression → Binary Logistic → OK

The first part of the output is shown in Table (1).
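The same fit can also be reproduced outside SPSS. The sketch below uses the statsmodels package, with all seven predictors entered in one step as in the Enter method; the data file and column names are assumptions.

import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey_coded.csv")     # hypothetical 0/1-coded data
X = sm.add_constant(df[["Var1", "Var2", "Var3", "Var4", "Var5", "Var6", "Var7"]])
y = df["y"]

result = sm.Logit(y, X).fit()            # binary logistic regression by maximum likelihood
print(result.summary())                  # coefficients, standard errors, tests, intervals
print(-2 * result.llnull)                # -2 log likelihood of the constant-only model
print(-2 * result.llf)                   # -2 log likelihood of the fitted model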

Table (1): Case Processing Summary

Unweighted Cases(a)                                N      Percent
Selected Cases       Included in Analysis         161      100.0
                     Missing Cases                  0         .0
                     Total                        161      100.0
Unselected Cases                                    0         .0
Total                                             161      100.0

a If weight is in effect, see classification table for the total number of cases.

Table (1) shows that all 161 cases are included in the analysis and there are no missing data. Table (2) shows the coding of the dependent variable.
Table (2): Dependent Variable Encoding

Original Value        Internal Value
...                         0
...                         1


Table (3) gives the iteration history of the estimation: the coefficients and the value of -2 log likelihood at each iteration.

Table (3): Iteration History(a,b,c,d)

                      -2 Log                            Coefficients
Iteration           likelihood   Constant      Q1      Q2      Q3      Q4      Q5      Q6      Q7
Step 1    1           68.181       5.014    -1.443    .497   -.132    .153   -.988    .118  -1.543
          2           41.578       7.447    -2.254   1.111   -.339    .374  -1.547    .173  -2.448
          3           31.391       9.558    -3.046   1.962   -.623    .679  -1.967    .102  -3.376
          4           27.400      11.505    -3.889   3.015   -.902   1.005  -2.242   -.119  -4.332
          5           26.237      12.981    -4.678   3.998  -1.075   1.277  -2.371   -.363  -5.134
          6           26.092      13.612    -5.100   4.507  -1.106   1.405  -2.407   -.483  -5.519
          7           26.089      13.702    -5.177   4.599  -1.100   1.424  -2.410   -.502  -5.583
          8           26.089      13.704    -5.179   4.601  -1.099   1.425  -2.410   -.502  -5.585
          9           26.089      13.704    -5.179   4.601  -1.099   1.425  -2.410   -.502  -5.585

a Method: Enter
b Constant is included in the model.
c Initial -2 Log Likelihood: 213.652
d Estimation terminated at iteration number 9 because parameter estimates changed by less than .001.

The value of -2 log likelihood decreases from one iteration to the next and settles at (-2 log likelihood = 26.089). Estimation stopped at iteration 9 because the parameter estimates changed by less than 0.001, as can be seen from iterations 6, 7, 8 and 9 of Table (3).
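The kind of iteration summarized in Table (3) can be sketched as a Newton-Raphson loop. This is a generic sketch, not SPSS's exact algorithm, but it reports -2 log likelihood at every iteration and uses the same stopping rule (every coefficient change smaller than 0.001); X must contain a leading column of ones for the constant, and the data are assumed to show no perfect separation.

import numpy as np

def fit_logit_newton(X, y, tol=1e-3, max_iter=20):
    n, k = X.shape
    b = np.zeros(k)                              # start from all coefficients equal to 0
    for it in range(1, max_iter + 1):
        p = 1.0 / (1.0 + np.exp(-X @ b))         # current fitted probabilities
        grad = X.T @ (y - p)                     # score vector
        W = p * (1.0 - p)                        # Bernoulli variances
        info = X.T @ (X * W[:, None])            # information matrix
        step = np.linalg.solve(info, grad)
        b = b + step                             # Newton-Raphson update
        p_new = 1.0 / (1.0 + np.exp(-X @ b))
        m2ll = -2.0 * np.sum(y * np.log(p_new) + (1 - y) * np.log(1 - p_new))
        print(it, round(m2ll, 3), np.round(b, 3))
        if np.max(np.abs(step)) < tol:           # parameter change < 0.001
            break
    return b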


Table (4) reports, for the coefficients of the final iteration of Table (3) (constant, b1, ..., b7), the standard errors, Wald statistics, significance levels, Exp(B) and its confidence interval.

Table (4): Variables in the Equation

                          B      S.E.     Wald   df    Sig.      Exp(B)    95.0% C.I. for Exp(B)
                                                                            Lower       Upper
Step 1(a)   Q1        -5.179    1.665    9.676    1    .002        .006      .000        .147
            Q2         4.601    1.870    6.052    1    .014      99.605     2.548    3893.270
            Q3        -1.099    1.418     .601    1    .438        .333      .021       5.364
            Q4         1.425    1.373    1.076    1    .300       4.156      .282      61.320
            Q5        -2.410    1.157    4.339    1    .037        .090      .009        .867
            Q6         -.502    1.317     .145    1    .703        .605      .046       8.000
            Q7        -5.585    1.558   12.843    1    .000        .004      .000        .080
            Constant  13.704    4.766    8.266    1    .004  894364.640

a Variable(s) entered on step 1: Q1, Q2, Q3, Q4, Q5, Q6, Q7.

Before interpreting the Wald statistic of each coefficient, the goodness of fit of the model as a whole is examined. Logistic regression has no exact counterpart of the F test and R2 of ordinary regression; instead the log likelihood Ratio (Chi Square) test is used:

χ2 = 2[loge L1 - loge L0]

where:
L1: the likelihood of the model at step (i).
L0: the likelihood of the model at step (i-1).

The computed value is χ2 = 187.563 with d.f = 7 and Sig = 0.000, i.e. less than 0.001, as shown in Table (5); the model containing the variables entered by the (Enter) method is therefore significant as a whole.
Table (5): Omnibus Tests of Model Coefficients

                    Chi-square     df     Sig.
Step 1   Model        187.563       7     .000
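This value can be checked directly from the two -2 log likelihood figures reported with Table (3): 213.652 for the model with only the constant and 26.089 for the model with the seven variables. A minimal sketch:

from scipy.stats import chi2

lr = 213.652 - 26.089                 # = 2[loge L1 - loge L0] = 187.563
df = 7                                # seven coefficients tested
p_value = chi2.sf(lr, df)             # upper-tail chi-square probability
print(round(lr, 3), p_value)          # p is far below 0.001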

A further check of goodness of fit is the Hosmer and Lemeshow χ2 test, which compares the observed frequencies (observed) with the frequencies expected under the model (Expected). The observed and expected frequencies in each group are given in the contingency Table, Table (6).
Table (6): Contingency Table for Hosmer and Lemeshow Test

                       y = 0                     y = 1
              Observed    Expected     Observed    Expected      Total
Step 1    1      15        14.997          0          .003         15
          2      15        14.959          0          .041         15
          3      16        16.570          1          .430         17
          4      14        13.189          2         2.811         16
          5       1         1.109         16        15.891         17
          6       0          .135         16        15.865         16
          7       0          .037         16        15.963         16
          8       0          .002         19        18.998         19
          9       0          .001         18        17.999         18
          10      0          .000         12        12.000         12


The Hosmer and Lemeshow test compares the observed values of (y) with the values of (y) predicted by the model; its statistic is denoted (H). The cases are grouped according to their predicted probabilities, which lie in [0 , 1], into m = 10 groups. With Yi the observed response and p(Xi) the predicted probability of case i, group k is:

Jk = [ i : (k-1)/m < P(Xi) ≤ k/m ]

and for each group k:

h1k = Σ (i in Jk) p(Xi)          (expected successes)
ĥ1k = Σ (i in Jk) Yi             (observed successes)
h2k = Σ (i in Jk) [1 - p(Xi)]    (expected failures)
ĥ2k = Σ (i in Jk) [1 - Yi]       (observed failures)

The null hypothesis (Null Hypothesis) is:

H0 : h1k = ĥ1k  and  h2k = ĥ2k

and the test statistic (H) is:

H = Σ (s = 1, 2) Σ (j = 1, ..., m) (ĥsj - hsj)^2 / hsj

which follows a χ2 distribution with d.F = m - 2. From Table (7), H Statistic = 1.292 with d.F = 8 and sig = 0.996, so the null hypothesis is not rejected and the model fits the data well; Table (6) above shows the (k) groups into which the predicted probabilities (0-1) were divided.
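A sketch of this calculation, using the group definition given above (the interval (0, 1] cut into m equal parts); y and p are numpy arrays of observed 0/1 responses and predicted probabilities, and groups whose expected count is zero are skipped to avoid division by zero.

import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y, p, m=10):
    H = 0.0
    for k in range(1, m + 1):
        in_group = (p > (k - 1) / m) & (p <= k / m)   # group J_k as defined above
        if k == 1:
            in_group |= (p == 0.0)                    # put p = 0 into the first group
        if not in_group.any():
            continue                                  # empty interval contributes nothing
        exp1 = p[in_group].sum()                      # expected successes  h_1k
        obs1 = y[in_group].sum()                      # observed successes
        exp0 = (1.0 - p[in_group]).sum()              # expected failures   h_2k
        obs0 = (1.0 - y[in_group]).sum()              # observed failures
        if exp1 > 0:
            H += (obs1 - exp1) ** 2 / exp1
        if exp0 > 0:
            H += (obs0 - exp0) ** 2 / exp0
    return H, chi2.sf(H, m - 2)                       # compare with chi-square, d.F = m - 2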


Table (7): Hosmer and Lemeshow Test

           Chi-square     df     Sig.
Step 1        1.292        8     .996

The Classification Table, Table (8), also indicates a good fit: the overall percentage of correctly classified cases is overall Percentage = 97.5%, since (98 + 59)/161 = 0.975, and only (4) cases, i.e. 2.5%, are misclassified.

Table (8): Classification Table(a)

                                       Predicted
Observed                          y = 0      y = 1     Percentage Correct
Step 1     y = 0                    59          2            96.7
           y = 1                     2         98            98.0
           Overall Percentage                                97.5

a The cut value is .500
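The figures in Table (8) follow from cutting the predicted probabilities at 0.500 and cross-tabulating against the observed responses; a small sketch (y is an integer 0/1 array, p the predicted probabilities):

import numpy as np

def classification_table(y, p, cut=0.5):
    pred = (p >= cut).astype(int)                 # predicted category at the cut value
    table = np.zeros((2, 2), dtype=int)           # rows: observed, columns: predicted
    for obs, pr in zip(y, pred):
        table[obs, pr] += 1
    overall = np.trace(table) / len(y)            # here (98 + 59) / 161 = 0.975
    return table, overall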

4- Interpretation of the results:
The estimated coefficients (B) in Table (4) give the Log-odds equation:

log(p / (1 - p)) = 13.704 - 5.179 X1 + 4.601 X2 - 1.099 X3 + 1.425 X4 - 2.410 X5 - 0.502 X6 - 5.585 X7
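Substituting the 0/1 values of the seven variables into this equation and inverting the logit gives the predicted probability; a small sketch using the coefficients above:

import numpy as np

b = np.array([13.704, -5.179, 4.601, -1.099, 1.425, -2.410, -0.502, -5.585])

def predicted_probability(x):                     # x = (x1, ..., x7), each 0 or 1
    logit = b[0] + np.dot(b[1:], x)               # log(p / (1 - p))
    return 1.0 / (1.0 + np.exp(-logit))

# Example: all regressors equal to 0 gives p = 1 / (1 + exp(-13.704)), about 0.999999
print(predicted_probability(np.zeros(7)))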


In this way the probability p can be obtained for any combination of values of the independent variables through the inverse of the (logit) transformation. The standard error S.E of each coefficient is

S.E(Bi) = sqrt(hii)

where hii is the i-th diagonal (Diag) element of the covariance matrix of the estimates:

Cov(B) = [X' Diag(ni Pi(1 - Pi)) X]^(-1)

Each coefficient is tested with the Wald statistic:

Wald = [bi / S.E(bi)]^2

which follows a χ2 distribution with d.F = 1. The Wald values are given in Table (4); for example:

Wald for var1 = [5.179 / 1.665]^2 = 9.676

The test of H0: bi = 0 against H1: bi ≠ 0 is decided by comparing Sig with the level α = 0.05: when Sig < 0.05, H0 is rejected. The column Exp(B) gives the odds Ratio, the factor by which the odds p(y)/[1 - p(y)] are multiplied when the variable changes by one unit; for var1:

Exp(-5.179) = e^(-5.179) = 0.006 = Odds Ratio
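These quantities for any row of Table (4) can be recomputed from B and S.E. alone. A sketch for Var1; the 95% limits for Exp(B) use the usual B ± 1.96 S.E. interval, which is an assumption about how the table was produced.

import numpy as np
from scipy.stats import chi2

b, se = -5.179, 1.665
wald = (b / se) ** 2                      # [5.179 / 1.665]^2 = 9.676
sig = chi2.sf(wald, 1)                    # about 0.002, with d.F = 1
odds_ratio = np.exp(b)                    # Exp(B) = 0.006
ci_low = np.exp(b - 1.96 * se)            # lower 95% limit for Exp(B)
ci_high = np.exp(b + 1.96 * se)           # upper 95% limit for Exp(B)
print(round(wald, 3), round(sig, 3), round(odds_ratio, 3),
      round(ci_low, 3), round(ci_high, 3))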


The last two columns give the 95% confidence interval C.I for Exp(B). The results in Table (4) can therefore be read as follows:

The variable (Var7) has a significant effect on the dependent variable (Y): b7 = -5.585, with d.F = 1, Significance < 0.001 and wald statistic = 12.843 [1.57-1870].

The variable (Var1) also has a significant effect on (Y): b1 = -5.179, meaning that moving from the first category of (X) (15000 or less) to the second category (> 15000) decreases the logit of (Y) by (5.179); Sig = 0.002 and wald = 9.676.

The variable (Var2) is likewise significant: moving from the category coded "code=1" to the category coded "code=2" of (X) increases the logit by b2 = 4.601; Sig = 0.014 and wald = 6.052.

) " (Var5 code=1


" code=2 Y ) (X code=1 code=2

b5=-2.410 ) (Y
sig= 0.037 Y .wald=4.339

The remaining variables (Var3, Var4 and Var6) have no significant effect on Y, since their significance levels (Sig = 0.438, Sig = 0.300 and Sig = 0.703 respectively) all exceed 0.05.



Conclusions:
1- ...
2- ... (15000) ...
3- ...

Recommendations:
1- ...
2- ...
3- ...
4- ...


References:
1- ... (1999): ...
2- ... (1998): "..." ...
3- ... ( ): ...
4- ... (1983): ...
5- ... (1990): (...)


1- Brown, C.E. (1998): Applied Multivariate Statistics in Geohydrology and Related Sciences, Springer-Verlag, Berlin Heidelberg, Chapter 6, Multiple Regression, pp. 62-66.
2- Cox, D.R. (1966): Some procedures associated with the logistic qualitative response curves. In: Research Papers in Statistics (Birkbeck College, University of London), pp. 55-71.
3- Draper, N.R. and Smith, H. (1981): Applied Regression Analysis, New York, p. 413.
4- Hosmer, D.W., Lemeshow, S. and Klar, J. (1988): Goodness of fit testing for the logistic model when the estimated probabilities are small.
5- Kleinbaum, D.G., Kupper, L.L. and Muller, K.E. (1988): Applied Regression Analysis and Other Multivariable Methods, PWS-KENT Publishing Company, a division of Wadsworth, p. 317.
6- Koutsoyiannis, A. (1977): Theory of Econometrics: An Introductory Exposition of Econometric Methods, p. 129.

The paper was received on 2009/9/10.
