Unified Estimation Framework for Multilevel Logistic Regression: Integrating Maximum Likelihood, Entropy-Based Regularization, and Bayesian MAP via Newton-Raphson Optimization
Keywords:
multilevel logistic regression; entropy regularization; Newton–Raphson; MAP estimation; separation robustness
Abstract
Multilevel logistic regression is foundational for the analysis of nested binary data but practitioners face a trade-off: maximum-likelihood estimators are computationally efficient yet vulnerable to separation and rare-event instability, whereas fully Bayesian sampling confers stability at substantial computational cost. To reconcile these competing demands, we introduce a unified estimation framework that reframes Bayesian MAP estimation with an entropy-based prior as a regularized optimization problem, thereby integrating data fidelity (maximum likelihood), entropy-based regularization that penalizes overly certain or implausible parameter configurations, and a probabilistic MAP interpretation within a single objective. We derive the joint penalized marginal log-likelihood under a Laplace approximation for random effects, provide explicit expressions for gradient and Hessian contributions from both likelihood and entropy terms, and implement a Newton–Raphson optimizer that exploits the block-diagonal Hessian structure typical of hierarchical models and incorporates numerical safeguards (ridge augmentation, line-search) for stability. Theoretical analysis establishes consistency and asymptotic normality of the estimator under regularity conditions and clarifies the dual frequentist–Bayesian interpretation of the entropy penalty. Extensive simulation studies, designed to probe separation, rare events, unbalanced cluster sizes, and a broad range of random-effect variances, demonstrate that the unified estimator substantially reduces non-convergence and extreme coefficient estimates relative to unregularized ML, attains point-estimation accuracy and interval calibration comparable to MCMC-based Bayesian methods in representative settings, and requires far less computational effort than full posterior sampling—making empirical-Bayes selection of the regularization weight practically feasible. 
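A plausible schematic form of the unified objective described above (the symbols here are illustrative, not taken from the paper's notation):

```latex
\hat{\theta} \;=\; \arg\max_{\theta}\;
  \ell_{\mathrm{Lap}}(\theta \mid y) \;-\; \lambda\, R_{\mathrm{ent}}(\theta),
```

where $\ell_{\mathrm{Lap}}$ denotes the marginal log-likelihood after a Laplace approximation over the random effects, $R_{\mathrm{ent}}$ the entropy-based penalty, and $\lambda \ge 0$ the regularization weight (selected by empirical Bayes in the paper). Equivalently, $\hat{\theta}$ admits the MAP reading under the prior $\pi(\theta) \propto \exp\{-\lambda\, R_{\mathrm{ent}}(\theta)\}$, which is the dual frequentist–Bayesian interpretation the abstract refers to.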
The proposed approach thus provides a computationally efficient and theoretically principled alternative for estimating multilevel logistic models, bridging the gap between frequentist and Bayesian paradigms while preserving interpretability and reproducibility.
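To make the optimization ingredients concrete, the following is a minimal sketch of a safeguarded Newton–Raphson iteration of the kind the abstract describes: a penalized logistic log-likelihood, ridge augmentation of the Hessian, and a backtracking line search. This is a hypothetical single-level illustration with a simple quadratic penalty standing in for the entropy term, not the authors' implementation; all names (`newton_step`, `fit`, `lam`, `ridge`) are invented for the example.

```python
import numpy as np

def sigmoid(z):
    # Clip to avoid overflow in exp for extreme linear predictors.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def penalized_nll(beta, X, y, lam):
    # Negative log-likelihood plus a quadratic penalty (stand-in for the
    # entropy-based regularizer in the paper).
    eta = X @ beta
    return np.sum(np.logaddexp(0.0, eta) - y * eta) + 0.5 * lam * beta @ beta

def newton_step(beta, X, y, lam=0.1, ridge=1e-6):
    """One Newton-Raphson step with ridge augmentation and backtracking."""
    p = sigmoid(X @ beta)
    grad = X.T @ (p - y) + lam * beta            # likelihood + penalty gradient
    W = p * (1.0 - p)                            # IRLS working weights
    k = len(beta)
    # Hessian of the penalized objective, plus a small ridge for stability.
    H = X.T @ (X * W[:, None]) + (lam + ridge) * np.eye(k)
    direction = np.linalg.solve(H, grad)
    # Backtracking line search: halve the step until the objective decreases.
    step, f0 = 1.0, penalized_nll(beta, X, y, lam)
    while step > 1e-8 and penalized_nll(beta - step * direction, X, y, lam) > f0:
        step *= 0.5
    return beta - step * direction

def fit(X, y, lam=0.1, tol=1e-8, max_iter=100):
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        new = newton_step(beta, X, y, lam)
        if np.max(np.abs(new - beta)) < tol:
            return new
        beta = new
    return beta
```

Even under complete separation, where unregularized ML coefficients diverge, the penalty keeps the iterates bounded, which is the separation robustness the abstract claims for the full multilevel estimator:

```python
X = np.column_stack([np.ones(20), np.linspace(-1.0, 1.0, 20)])
y = (X[:, 1] > 0).astype(float)   # perfectly separated data
beta = fit(X, y, lam=0.5)          # finite despite separation
```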
Published
2026-03-27
How to Cite
Ekhlas Al-Ameri, Mushtaq K. Abdalrahem, & Enas A. Mohammed. (2026). Unified Estimation Framework for Multilevel Logistic Regression: Integrating Maximum Likelihood, Entropy-Based Regularization, and Bayesian MAP via Newton-Raphson Optimization. Statistics, Optimization & Information Computing. https://doi.org/10.19139/soic-2310-5070-3225
Section
Research Articles
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).