Logistic Regression

II. Binary Logistic Regression Interpretation

            Under Table # 1, Model Summary, we see that the -2 Log Likelihood statistic is 289.966. This statistic measures how poorly the model predicts the outcome: the smaller the statistic, the better the model. Although SPSS does not report this statistic for the model that contained only the intercept, adding the second variable decreased the -2 Log Likelihood statistic by 325.666 - 289.966 = 35.7, the chi-square (χ2) statistic. The Cox & Snell R2 can be interpreted like R2 in a multiple regression; its value is .003. The Nagelkerke R2 value is .005, which is close to 0 and indicates a weak relationship between the variables.

            The Cox & Snell R2 and the Nagelkerke R2 are attempts to provide a logistic analogue to R2 in OLS regression. The Nagelkerke measure adjusts the Cox & Snell measure so that it ranges from 0 to 1, as R2 does in OLS.
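As a hedged illustration of how the two measures are related, here is a minimal Python sketch assuming the standard Cox & Snell and Nagelkerke definitions; the inputs below are placeholders, not the values from Table # 1, since the sample size is not reported in this excerpt.

import math

def pseudo_r2(neg2ll_null, neg2ll_full, n):
    """Cox & Snell and Nagelkerke pseudo-R2 from two -2 Log Likelihood values."""
    # Cox & Snell R2 = 1 - (L0/L)^(2/n), rewritten in terms of the -2LL values
    cox_snell = 1.0 - math.exp((neg2ll_full - neg2ll_null) / n)
    # Nagelkerke rescales Cox & Snell by its maximum attainable value,
    # 1 - L0^(2/n), so that the measure can reach 1
    r2_max = 1.0 - math.exp(-neg2ll_null / n)
    return cox_snell, cox_snell / r2_max

# Placeholder inputs, for illustration only
cs, nk = pseudo_r2(neg2ll_null=120.0, neg2ll_full=110.0, n=200)
print(f"Cox & Snell R2 = {cs:.3f}, Nagelkerke R2 = {nk:.3f}")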

            The null model -2 Log Likelihood is given by -2 * ln(L0), where L0 is the likelihood of obtaining the data if the independent variables had no effect on the outcome.

            The full model -2 Log Likelihood is given by -2 * ln(L), where L is the likelihood of obtaining the data with all of the independent variables included in the model.

            The difference between these two values yields a chi-square statistic, which is a measure of how well the independent variables influence the outcome (dependent) variable.
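Written out with the -2 Log Likelihood values quoted earlier, this difference is the likelihood-ratio chi-square statistic:

\chi^2 = -2\ln(L_0) - \left(-2\ln(L)\right) = 325.666 - 289.966 = 35.7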

            If the p-value for the overall model fit statistic is less than the conventional 0.05, then there is evidence that at least one of the independent variables contributes to the prediction of the outcome (Connolly & Liang, 2008).
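As a sketch of that decision rule in Python: the degrees of freedom equal the number of predictor terms added to the model, which is not stated in this excerpt, so df = 4 below is only an assumption.

from scipy.stats import chi2

chi_sq = 35.7   # likelihood-ratio chi-square from the -2 Log Likelihood difference
df = 4          # assumed number of predictor terms; not given in this excerpt

p_value = chi2.sf(chi_sq, df)   # P(chi-square with df degrees of freedom >= chi_sq)
if p_value < 0.05:
    print(f"p = {p_value:.4f}: at least one predictor contributes to the prediction")
else:
    print(f"p = {p_value:.4f}: no evidence that the predictors improve the model")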

           In Table # 2, the classification table is a 2 x 2 table that tallies correct and incorrect predictions for the full model, with the independent variables as well as the constant. The columns are the two predicted values of the dependent variable, while the rows are the two observed (actual) values of the dependent variable. In a perfect model, all cases would fall on the diagonal and the overall percentage correct would be 100%. If the logistic model has homoscedasticity (not a logistic regression assumption), the percentage correct will be approximately the same for both rows (Arnold & Sarabia, 2008). Here it is not: the model correctly predicts all but seven of the non-minority cases but correctly predicts only one of the minority cases. While the overall percentage correctly predicted, 76.8%, appears quite good, the researcher should note that blindly guessing the most common class (non-minority) for every case would yield an even higher percentage correct (78.1%), as noted above. This suggests that minority status cannot be differentiated on the basis of education, job experience, job category, and gender for these data.
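The comparison being made here can be sketched in Python with hypothetical cell counts chosen only so that they are consistent with the 76.8% and 78.1% figures quoted above; the actual counts in Table # 2 are not reproduced in this excerpt.

# Hypothetical 2 x 2 classification table: rows = observed class, columns = predicted class
table = {
    "non-minority": {"non-minority": 363, "minority": 7},
    "minority":     {"non-minority": 103, "minority": 1},
}

n_total   = sum(sum(row.values()) for row in table.values())
n_correct = sum(table[k][k] for k in table)            # cases on the diagonal
n_modal   = sum(table["non-minority"].values())        # size of the most common class

print(f"Overall percent correct: {100 * n_correct / n_total:.1f}%")        # ~76.8%
print(f"Always guessing non-minority: {100 * n_modal / n_total:.1f}%")     # ~78.1%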

The Variables in the Equation output shows us that the regression equation is:
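The specific coefficient estimates are not reproduced in this excerpt; for reference only, an equation taken from a Variables in the Equation table has the general form, in terms of the log-odds of the outcome:

\ln\!\left(\frac{p}{1-p}\right) = B_0 + B_1 X_1 + B_2 X_2 + \dots + B_k X_k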

           From Table # 3, the Wald statistic and the corresponding significance level test the significance of each of the covariate and dummy independent variables in the model. The ratio of the logistic coefficient B to its ...
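For reference, the ratio described here is the basis of the standard Wald statistic, which squares the coefficient divided by its standard error:

\text{Wald} = \left(\frac{B}{\mathrm{S.E.}(B)}\right)^{2}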