Comparison of Subspace Dimension Reduction Methods in Logistic Regression

  • Saeed Heydari, Persian Gulf University
  • Mahmoud Afshari, Persian Gulf University
  • Saeed Tahmasbi, Persian Gulf University
  • Morad Alizadeh, Persian Gulf University
Keywords: Dimension reduction, Likelihood acquired direction, Sliced average variance estimation, Sliced inverse regression.

Abstract

Regression models are very useful for describing and predicting real-world phenomena. Logistic regression is a robust and flexible method for predicting dichotomous outcomes; strictly speaking, it is a classification model rather than a regression model. When the number of predictors is large, data analysis becomes difficult, and dimension reduction has therefore become one of the central issues in regression analysis with high-dimensional data. In this paper, methods for reducing the dimension of the predictors in logistic regression are considered, including estimation of the central subspace based on inverse regression (sliced inverse regression and sliced average variance estimation), the likelihood acquired directions method, and principal component analysis. Using a real data set on dental problems, the logistic regression model is fitted and the correct classification rate is computed. Finally, a simulation study is presented to compare the sufficient dimension reduction methods. The simulations were carried out in MATLAB, and the programs are attached in an appendix.
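
Among the inverse-regression estimators named above, sliced inverse regression (SIR) is the simplest to illustrate. The sketch below, written in Python/NumPy rather than the MATLAB programs referenced in the paper, shows how a SIR direction might be estimated for a binary response and then passed to a logistic fit; the function name sir_directions, the simulated single-index data, and the use of scikit-learn's LogisticRegression are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' MATLAB code): sliced inverse regression (SIR)
# for a binary response, followed by a logistic fit on the reduced predictor.
# Assumes NumPy and scikit-learn are available; data and names are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

def sir_directions(X, y, n_directions=1):
    """Estimate central-subspace directions by SIR.

    With a binary response the slices are simply the two classes,
    so SIR can recover at most one direction.
    Assumes the covariance matrix of X is nonsingular.
    """
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    # Standardize the predictors: Z = (X - mu) Sigma^{-1/2}
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_half = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_half
    # SIR kernel: weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for level in np.unique(y):
        Z_slice = Z[y == level]
        m = Z_slice.mean(axis=0)
        M += (len(Z_slice) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    return Sigma_inv_half @ v[:, ::-1][:, :n_directions]

# Toy single-index logistic model: y depends on X only through X @ beta.
rng = np.random.default_rng(0)
n, p = 500, 6
X = rng.normal(size=(n, p))
beta = np.array([1.0, -1.0, 0.5, 0.0, 0.0, 0.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta)))

B = sir_directions(X, y, n_directions=1)   # estimated direction (p x 1)
X_reduced = X @ B                          # reduced predictor
clf = LogisticRegression().fit(X_reduced, y)
print("correct classification rate:", clf.score(X_reduced, y))
```

Under the same workflow, sliced average variance estimation or likelihood acquired directions would replace the SIR kernel matrix M with their own candidate matrices, while the downstream logistic fit on the reduced predictor stays unchanged.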

Published
2021-09-23
How to Cite
Heydari, S., Afshari, M., Tahmasbi, S., & Alizadeh, M. (2021). Comparison of Subspace Dimension Reduction Methods in Logistic Regression. Statistics, Optimization & Information Computing, 11(2), 422-444. https://doi.org/10.19139/soic-2310-5070-1303
Section
Research Articles