Publication | Closed Access

A solution to the problem of separation in logistic regression

Citations: 2K
References: 30
Year: 2002

TLDR

Separation, or monotone likelihood, arises in logistic regression when the likelihood converges while at least one parameter estimate diverges, typically in small samples with unbalanced, highly predictive risk factors. The study demonstrates that Firth’s bias‑reduction procedure offers an ideal solution to separation in logistic regression. It employs penalized maximum likelihood estimation to yield finite parameter estimates, with Wald tests and confidence intervals available, though penalized likelihood ratio tests and profile penalized likelihood confidence intervals are often preferable. Analysis of two cancer studies demonstrates the clear advantage of this procedure over previous methods.

Abstract

The phenomenon of separation or monotone likelihood is observed in the fitting process of a logistic model if the likelihood converges while at least one parameter estimate diverges to ±∞. Separation primarily occurs in small samples with several unbalanced and highly predictive risk factors. A procedure by Firth originally developed to reduce the bias of maximum likelihood estimates is shown to provide an ideal solution to separation. It produces finite parameter estimates by means of penalized maximum likelihood estimation. Corresponding Wald tests and confidence intervals are available but it is shown that penalized likelihood ratio tests and profile penalized likelihood confidence intervals are often preferable. The clear advantage of the procedure over previous options of analysis is impressively demonstrated by the statistical analysis of two cancer studies.
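The mechanics of Firth's penalization can be sketched in a few lines: the penalized log-likelihood l(β) + ½ log|I(β)| leads to a modified score in which each residual y_i − π_i gains a term h_i(½ − π_i), with h_i the diagonal of the logistic hat matrix. Below is a minimal NumPy sketch of this Newton iteration (a hypothetical helper, not the authors' code), applied to a completely separated toy data set where ordinary maximum likelihood would diverge:

```python
import numpy as np

def firth_logistic(X, y, max_iter=100, tol=1e-8):
    """Bias-reduced (Firth-penalized) logistic regression.

    Maximizes l(beta) + 0.5*log|I(beta)| by Newton steps on the
    modified score, which stays finite even under separation.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(max_iter):
        eta = X @ beta
        pi = 1.0 / (1.0 + np.exp(-eta))
        W = pi * (1.0 - pi)                  # Bernoulli variance weights
        XtW = X.T * W                        # column i holds W_i * x_i
        info = XtW @ X                       # Fisher information X'WX
        # Diagonal h_i of the hat matrix W^{1/2} X (X'WX)^{-1} X' W^{1/2}
        h = np.einsum('ij,ji->i', X @ np.linalg.inv(info), XtW)
        # Firth-modified score: sum_i (y_i - pi_i + h_i(1/2 - pi_i)) x_i
        score = X.T @ (y - pi + h * (0.5 - pi))
        step = np.linalg.solve(info, score)
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Completely separated toy data: x < 0 implies y = 0, x > 0 implies y = 1,
# so the ordinary ML slope estimate diverges to +infinity.
X = np.column_stack([np.ones(8), [-3., -2., -1., -0.5, 0.5, 1., 2., 3.]])
y = np.array([0., 0., 0., 0., 1., 1., 1., 1.])
beta = firth_logistic(X, y)
```

On such data, unpenalized iterations would grow the slope without bound, while the penalized fit returns a finite positive slope; the Wald and profile-penalized-likelihood inference discussed in the abstract would then be built on these finite estimates.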
