# Download e-book for iPad: Applied Logistic Regression (Wiley Series in Probability and Mathematical Statistics) by David W. Hosmer, Stanley Lemeshow

By David W. Hosmer, Stanley Lemeshow

ISBN-10: 0471615536

ISBN-13: 9780471615538

This e-book discusses how to model a binary outcome variable from a linear regression perspective. It develops the logistic regression model and describes its use in methods for modelling the relationship between a dichotomous outcome variable and a set of covariates. A discussion of the interpretation of this model follows. The text includes several data sets that are the source of the examples and exercises. The book also uses a number of software packages, including BMDP, EGRET, GLIM, SAS, and SYSTAT, to analyze the data sets.

Read or Download Applied Logistic Regression (Wiley Series in Probability and Mathematical Statistics. Applied Probability and Statistics Section) PDF

Best probability & statistics books

Download PDF by E.L. Lehmann: Elements of Large-Sample Theory

Elements of Large-Sample Theory provides a unified treatment of first-order large-sample theory. It discusses a broad range of applications, including introductions to density estimation, the bootstrap, and the asymptotics of survey methodology, written at an elementary level. The book is suitable for students at the Master's level in statistics and in applied fields who have a background of two years of calculus.

New PDF release: An Introduction to Categorical Data Analysis, Second Edition

The first edition of this text has sold over 19,600 copies. Since then, the use of statistical methods for categorical data has increased dramatically, particularly for applications in the biomedical and social sciences, making a second edition of this introductory version of the book timely.

Extra resources for Applied Logistic Regression (Wiley Series in Probability and Mathematical Statistics. Applied Probability and Statistics Section)

Sample text

When the overlap is at a single or a few tied values, the configuration was termed quasicomplete separation by Albert and Anderson. As x takes on values greater than 6, the overlap becomes greater and the estimated parameters and standard errors begin to attain more reasonable values. The sensitivity of the fit to the overlap will, of course, depend on the sample size and the range of the covariate. The tip-off that something is amiss is, as in the case of the zero cell count, the very large estimated coefficients and especially the large estimated standard errors.
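This effect of overlap on the standard errors can be sketched numerically. The following is a minimal illustration, not taken from the book: the data sets, the single overlapping case at x = 4, and the plain Newton-Raphson fitter are all hypothetical choices made for the example. It compares a data set whose two outcome groups barely overlap with one whose groups are well mixed.

```python
import numpy as np

def fit_logistic(x, y, n_iter=30):
    """Newton-Raphson for simple logistic regression with an intercept.
    Returns (coefficients, standard errors from the inverse information matrix)."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.zeros(2)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - p))
    # Estimated covariance matrix: inverse of the observed information at the MLE.
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)
    cov = np.linalg.inv(X.T @ (X * W[:, None]))
    return beta, np.sqrt(np.diag(cov))

# Hypothetical data: the groups barely overlap (a single y = 1 case at x = 4
# falls inside the range of the y = 0 cases) versus well-mixed groups.
x_bare = np.array([1, 2, 3, 4, 5, 6, 4, 7, 8, 9, 10, 11], dtype=float)
y_bare = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1], dtype=float)
x_mix = np.array([1, 2, 3, 5, 7, 9, 2, 4, 6, 8, 10, 11], dtype=float)
y_mix = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1], dtype=float)

b1, se1 = fit_logistic(x_bare, y_bare)
b2, se2 = fit_logistic(x_mix, y_mix)
print("barely overlapping:", b1, se1)   # noticeably larger standard errors
print("well mixed:        ", b2, se2)
```

With only slight overlap the estimates remain finite, but the information about the slope is carried by the few observations near the boundary, which is what inflates the standard errors.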

For the model containing x3 we see that the estimated coefficients are of reasonable magnitude, but the estimated standard errors are much larger than we would expect. The model containing all variables is a composite of the results of the other models. In all cases the tip-off for a problem comes from the aberrantly large estimated standard errors. In a more complicated data set, an analysis of the associations among the covariates using a collinearity analysis similar to that performed in linear regression should be helpful in identifying the dependencies among the covariates [see Belsley, Kuh, and Welsch (1980)].

The net result of this is that the maximum likelihood estimates do not exist [Albert and Anderson (1984); Santner and Duffy (1986)]. In order to have finite maximum likelihood estimates we must have some overlap in the distribution of the covariates in the model. A simple example illustrates the problem of complete separation and the results of fitting logistic regression models to such data. Consider data in which y = 0 for every value of x below some cutoff and y = 1 for every value above it, with the y = 1 group given by (6, 1), (7, 1), (8, 1), (9, 1), (10, 1), (11, 1). With such data we have complete separation, and all estimated parameters are huge, since the maximum likelihood estimates do not exist.
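The non-existence of the MLE under complete separation can be demonstrated directly. The sketch below is illustrative, not the book's computation: it assumes a completely separated data set in the pattern described above (y = 0 for x = 1 through 5, y = 1 for x = 6 through 11), centers x at the midpoint between the groups so a one-parameter slope-only model suffices, and maximizes the likelihood by plain gradient ascent. The slope estimate grows with every additional iteration while the log-likelihood creeps toward its supremum of 0 without ever reaching it.

```python
import numpy as np

# Illustrative completely separated data in the pattern from the text:
# y = 0 for every x below the cutoff, y = 1 for every x above it.
x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11], dtype=float)
y = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1], dtype=float)
xc = x - 5.5  # center at the midpoint between the two groups

def fit_slope(n_iter, lr=0.1):
    """Gradient ascent on the log-likelihood of p = sigmoid(beta * xc).
    Under complete separation the score is positive for every finite beta,
    so the estimate increases at every step and never converges."""
    beta = 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-beta * xc))
        beta += lr * np.sum(xc * (y - p))
    return beta

def loglik(beta):
    """Numerically stable log-likelihood: sum of y*eta - log(1 + exp(eta))."""
    eta = beta * xc
    return float(np.sum(y * eta - np.logaddexp(0.0, eta)))

for n in (100, 1000, 5000):
    b = fit_slope(n)
    print(f"{n:5d} iterations: slope = {b:.3f}, log-likelihood = {loglik(b):.5f}")
```

Running the loop longer always produces a larger slope and a log-likelihood closer to 0, which is exactly the behavior a fitting routine reports as huge coefficients with even larger standard errors.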