Hybrid GA–DeepAutoencoder–KNN Model for Employee Turnover Prediction

  • Chin Siang Lim
  • Esraa Faisal Malik
  • Khai Wah Khaw
  • Alhamzah Alnoor
  • Xinying Chew, Universiti Sains Malaysia, 11800 USM, Pulau Pinang, Malaysia
  • Zhi Lin Chong
  • Mariam Al Akasheh
Keywords: Autoencoder, Employee turnover, GA-DeepAutoencoder-KNN, Genetic algorithm, Hybrid machine learning architecture, KNN, Turnover prediction

Abstract

Organizations strive to retain their top talent and maintain workforce stability by predicting employee turnover and implementing preventive measures. Accurate turnover prediction models help organizations act proactively to retain employees and reduce turnover rates. In this study, we therefore propose a hybrid genetic algorithm–deep autoencoder–k-nearest neighbor (GA–DeepAutoencoder–KNN) model to predict employee turnover. The proposed model combines a genetic algorithm, a deep autoencoder, and a KNN classifier to enhance prediction accuracy. It was evaluated experimentally against the conventional DeepAutoencoder–KNN and k-nearest neighbor models and achieved a significantly higher accuracy of 90.95%, compared with 86.48% and 88.37%, respectively. Our findings are expected to assist HR teams in identifying at-risk employees and implementing targeted retention strategies that improve the retention of valuable employees. The proposed model can be applied across industries and organizations, making it a valuable tool for HR professionals seeking to improve workforce stability and productivity.
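To make the hybrid architecture concrete, the sketch below illustrates one plausible reading of the pipeline: a deep autoencoder compresses the tabular HR features, a KNN classifier predicts turnover from the bottleneck representation, and a genetic algorithm searches the joint hyperparameters (bottleneck width and k). This is a minimal, hedged example using scikit-learn and a synthetic stand-in dataset; the authors' actual layer sizes, GA operators, preprocessing, and dataset are not reproduced here.

```python
# Illustrative GA–DeepAutoencoder–KNN sketch (assumptions: scikit-learn only,
# synthetic data, a two-gene genome). Not the authors' exact implementation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

def encode(ae, X, n_encoder_layers=2):
    """Forward-pass X through the first layers of a trained MLP autoencoder."""
    h = X
    for W, b in zip(ae.coefs_[:n_encoder_layers], ae.intercepts_[:n_encoder_layers]):
        h = np.maximum(h @ W + b, 0.0)  # ReLU hidden activations
    return h

def fitness(genome, X_tr, y_tr, X_val, y_val):
    """Fitness = validation accuracy of KNN trained on the bottleneck features."""
    bottleneck, k = genome
    ae = MLPRegressor(hidden_layer_sizes=(32, bottleneck, 32),
                      activation="relu", max_iter=300, random_state=0)
    ae.fit(X_tr, X_tr)                               # reconstruct the inputs
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(encode(ae, X_tr), y_tr)                  # first two layers = encoder
    return accuracy_score(y_val, knn.predict(encode(ae, X_val)))

def genetic_search(X_tr, y_tr, X_val, y_val, pop_size=6, generations=4):
    """Tiny GA over (bottleneck width, k): tournament selection, crossover, mutation."""
    def random_genome():
        return (int(rng.integers(4, 17)), int(rng.integers(3, 16)))

    def mutate(g):
        b, k = g
        if rng.random() < 0.3:
            b = int(np.clip(b + rng.integers(-3, 4), 4, 16))
        if rng.random() < 0.3:
            k = int(np.clip(k + rng.integers(-2, 3), 3, 15))
        return (b, k)

    pop = [random_genome() for _ in range(pop_size)]
    best, best_fit = pop[0], -1.0
    for _ in range(generations):
        scores = [fitness(g, X_tr, y_tr, X_val, y_val) for g in pop]
        if max(scores) > best_fit:
            best_fit, best = max(scores), pop[int(np.argmax(scores))]

        def pick_parent():
            i, j = rng.choice(pop_size, size=2, replace=False)  # tournament of two
            return pop[i] if scores[i] >= scores[j] else pop[j]

        # Each child takes one gene from each parent, then may mutate.
        pop = [mutate((pick_parent()[0], pick_parent()[1])) for _ in range(pop_size)]
    return best, best_fit

if __name__ == "__main__":
    # Synthetic, imbalanced stand-in for an HR table with a binary "left" label.
    X, y = make_classification(n_samples=1200, n_features=30, n_informative=10,
                               weights=[0.84, 0.16], random_state=0)
    X = StandardScaler().fit_transform(X)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3,
                                                stratify=y, random_state=0)
    (bottleneck, k), acc = genetic_search(X_tr, y_tr, X_val, y_val)
    print(f"best bottleneck={bottleneck}, k={k}, validation accuracy={acc:.4f}")
```

The design point the sketch makes is that the GA treats the encoder and the classifier as one unit: a genome is scored only by the downstream KNN accuracy on the compressed representation, so the search favors bottleneck sizes and neighborhood sizes that work well together rather than tuning each stage in isolation.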

Published
2023-10-27
How to Cite
LIM, C. S., MALIK, E. F., KHAW, K. W., ALNOOR, A., CHEW, X., CHONG, Z. L., & Al Akasheh, M. (2023). Hybrid GA–DeepAutoencoder–KNN Model for Employee Turnover Prediction. Statistics, Optimization & Information Computing, 12(1), 75-90. https://doi.org/10.19139/soic-2310-5070-1799
Section
Research Articles