Unified Estimation Framework for Multilevel Logistic Regression: Integrating Maximum Likelihood, Entropy-Based Regularization, and Bayesian MAP via Newton-Raphson Optimization

  • Ekhlas Al-Ameri Department of Statistics, College of Administration and Economics, University of Kerbala, Iraq
  • Mushtaq K. Abdalrahem Department of Statistics, College of Administration and Economics, University of Kerbala, Iraq; College of Pharmacy, University of Al-Ameed, Iraq
  • Enas A. Mohammed Department of Statistics, College of Administration and Economics, University of Kerbala, Iraq
Keywords: multilevel logistic regression; entropy regularization; Newton–Raphson; MAP estimation; separation robustness

Abstract

Multilevel logistic regression is foundational for the analysis of nested binary data, but practitioners face a trade-off: maximum-likelihood estimators are computationally efficient yet vulnerable to separation and rare-event instability, whereas fully Bayesian sampling confers stability at substantial computational cost. To reconcile these competing demands, we introduce a unified estimation framework that reframes Bayesian MAP estimation with an entropy-based prior as a regularized optimization problem, thereby integrating data fidelity (maximum likelihood), entropy-based regularization that penalizes overly certain or implausible parameter configurations, and a probabilistic MAP interpretation within a single objective. We derive the joint penalized marginal log-likelihood under a Laplace approximation for the random effects, provide explicit expressions for the gradient and Hessian contributions from both the likelihood and entropy terms, and implement a Newton–Raphson optimizer that exploits the block-diagonal Hessian structure typical of hierarchical models and incorporates numerical safeguards (ridge augmentation, line search) for stability. Theoretical analysis establishes consistency and asymptotic normality of the estimator under regularity conditions and clarifies the dual frequentist–Bayesian interpretation of the entropy penalty. Extensive simulation studies, designed to probe separation, rare events, unbalanced cluster sizes, and a broad range of random-effect variances, demonstrate that the unified estimator substantially reduces non-convergence and extreme coefficient estimates relative to unregularized ML, attains point-estimation accuracy and interval calibration comparable to MCMC-based Bayesian methods in representative settings, and requires far less computational effort than full posterior sampling, making empirical-Bayes selection of the regularization weight practically feasible.
The proposed approach thus provides a computationally efficient and theoretically principled alternative for estimating multilevel logistic models, bridging the gap between frequentist and Bayesian paradigms while preserving interpretability and reproducibility.
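To illustrate the optimization scheme the abstract describes (a penalized log-likelihood maximized by Newton–Raphson with ridge augmentation and backtracking line search), the following is a minimal single-level sketch. It is not the authors' implementation: the multilevel structure, the Laplace approximation for random effects, and the paper's specific entropy-based prior are omitted, and a simple quadratic penalty stands in for the entropy term purely to show the MAP-as-regularized-optimization mechanics and how the penalty keeps estimates finite under separation.

```python
import numpy as np

def sigmoid(z):
    """Logistic function with clipping for numerical stability."""
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def penalized_objective(beta, X, y, lam):
    """Negative log-likelihood plus a quadratic stand-in for the entropy penalty."""
    p = sigmoid(X @ beta)
    eps = 1e-12
    nll = -np.sum(y * np.log(p + eps) + (1.0 - y) * np.log(1.0 - p + eps))
    return nll + 0.5 * lam * (beta @ beta)

def newton_map(X, y, lam=1.0, ridge=1e-6, max_iter=100, tol=1e-8):
    """MAP estimate via Newton-Raphson with ridge augmentation and line search."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(max_iter):
        p = sigmoid(X @ beta)
        grad = X.T @ (p - y) + lam * beta          # likelihood + penalty gradient
        W = p * (1.0 - p)                          # IRLS weights
        H = (X * W[:, None]).T @ X + lam * np.eye(d)
        H += ridge * np.eye(d)                     # ridge augmentation safeguard
        step = np.linalg.solve(H, grad)
        # Backtracking line search on the penalized objective.
        t, f0 = 1.0, penalized_objective(beta, X, y, lam)
        while penalized_objective(beta - t * step, X, y, lam) > f0 and t > 1e-8:
            t *= 0.5
        beta_new = beta - t * step
        if np.linalg.norm(beta_new - beta) < tol:
            return beta_new
        beta = beta_new
    return beta
```

Under complete separation, unpenalized ML drives coefficients to infinity; here the penalty bounds the objective, so the Newton iterates converge to finite values, which is the stabilization behavior the abstract attributes to the entropy prior.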
Published
2026-03-27
How to Cite
Ekhlas Al-Ameri, Mushtaq K. Abdalrahem, & Enas A. Mohammed. (2026). Unified Estimation Framework for Multilevel Logistic Regression: Integrating Maximum Likelihood, Entropy-Based Regularization, and Bayesian MAP via Newton-Raphson Optimization. Statistics, Optimization & Information Computing. https://doi.org/10.19139/soic-2310-5070-3225
Section
Research Articles