FORECASTING EMPLOYEE PRODUCTIVITY USING LSTM NETWORKS: SYNTHETIC DATA APPROACH AND BASELINE COMPARISON

Keywords: employee productivity, neural networks, LSTM, IT project management, performance evaluation

Abstract

Forecasting employee productivity is an increasingly important task in project management environments shaped by rapid change and digital transformation. This study develops and evaluates an approach to predicting employee productivity using Long Short-Term Memory (LSTM) neural networks. The methodology models temporal sequences of HR metrics that describe employee activity and performance in project teams. The results demonstrate that LSTM models effectively capture temporal dependencies in productivity indicators and provide accurate forecasts of performance dynamics. The practical value of the study lies in the possibility of integrating such models into HR analytics and project management systems to support performance monitoring and managerial decision-making. The proposed approach can also improve resource planning and support early identification of changes in team performance.
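As a minimal illustration of the setup the abstract describes, the sketch below generates a synthetic weekly productivity series, converts it into fixed-length lookback windows suitable for sequence models such as an LSTM, and computes a naive last-value baseline on a chronological hold-out. All specifics (series length, the trend-plus-seasonality generator, the 8-week lookback, the 20-week test split) are illustrative assumptions, not parameters from the paper; the LSTM itself would be trained on the same windows reshaped to `(samples, lookback, 1)`.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed synthetic weekly productivity index for one employee:
# linear trend + 4-week cycle + Gaussian noise.
n_weeks = 104
t = np.arange(n_weeks)
series = 50 + 0.1 * t + 5 * np.sin(2 * np.pi * t / 4) + rng.normal(0, 1.5, n_weeks)

def make_windows(x, lookback):
    """Turn a 1-D series into (samples, lookback) inputs and next-step targets."""
    X = np.stack([x[i : i + lookback] for i in range(len(x) - lookback)])
    y = x[lookback:]
    return X, y

lookback = 8
X, y = make_windows(series, lookback)   # X: (96, 8), y: (96,)

# Chronological split: hold out the last 20 weeks for evaluation.
X_test, y_test = X[-20:], y[-20:]

# Naive baseline for comparison: next week's productivity = this week's value.
y_naive = X_test[:, -1]
mae_naive = float(np.mean(np.abs(y_test - y_naive)))
print(X.shape, y.shape, round(mae_naive, 3))
```

A Keras-style LSTM would consume `X[..., None]` (shape `(samples, lookback, 1)`) and be judged against `mae_naive`; a sequence model is only worth deploying if it beats this trivial baseline on the held-out weeks.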

References

Sagheer A., Kotb M. Time series forecasting of petroleum production using deep LSTM recurrent networks. Neurocomputing. 2019. Vol. 323. P. 203–213. DOI: https://doi.org/10.1016/j.neucom.2018.09.082 (accessed November 12, 2025).

Nait Chabane A., Sahnoun M., Bettayeb B. Forecasting KPIs of production systems using LSTM networks. Proceedings of the 9th International Conference on Data Analytics. 2021. DOI: https://doi.org/10.1109/CyMaEn50288.2021.9497278 (accessed December 5, 2025).

Mahammad D. R., Priya S. L., Thulasimani T., Mann S., Mann S., Kalra G. Analysis and prediction of employee turnover characteristics based on LSTM-RNN-HMM model. Journal of Intelligent & Fuzzy Systems. 2025. DOI: https://doi.org/10.1201/9781003559115-56 (accessed December 18, 2025).

Giriprakash A., Arunadevi V., Rajalakshmi K., Khan S. A., Jeyalakshmi R., Valavan M. P. Achieving balanced allocation in human resources through LSTM framework method. Proceedings of the 2024 IEEE International Conference on Smart Data Services. 2025. DOI: https://doi.org/10.1109/ICISCN64258.2025.10934511 (accessed November 22, 2025).

Hochreiter S., Schmidhuber J. Long short-term memory. Neural Computation. 1997. Vol. 9, No. 8. P. 1735–1780. DOI: https://doi.org/10.1162/neco.1997.9.8.1735 (accessed November 3, 2025).

Greff K., Srivastava R. K., Koutník J., Steunebrink B. R., Schmidhuber J. LSTM: A search space odyssey. IEEE Transactions on Neural Networks and Learning Systems. 2017. Vol. 28, No. 10. P. 2222–2232. DOI: https://doi.org/10.1109/TNNLS.2016.2582924 (accessed December 14, 2025).

Goodfellow I., Bengio Y., Courville A. Deep learning. Cambridge: MIT Press, 2016. URL: https://www.deeplearningbook.org (accessed December 9, 2025).

Han J., Kamber M., Pei J. Data mining: Concepts and techniques. 3rd ed. Burlington: Morgan Kaufmann, 2011. DOI: https://doi.org/10.1016/C2009-0-61819-5 (accessed November 27, 2025).

Jolliffe I. T., Cadima J. Principal component analysis: A review and recent developments. Philosophical Transactions of the Royal Society A. 2016. Vol. 374, No. 2065. P. 20150202. DOI: https://doi.org/10.1098/rsta.2015.0202 (accessed November 16, 2025).

Breiman L. Random forests. Machine Learning. 2001. Vol. 45. P. 5–32. DOI: https://doi.org/10.1023/A:1010933404324 (accessed December 21, 2025).

Chollet F. Deep learning with Python. 2nd ed. Shelter Island: Manning Publications, 2021.

Ioffe S., Szegedy C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. Proceedings of the 32nd International Conference on Machine Learning. 2015. P. 448–456. URL: https://arxiv.org/abs/1502.03167 (accessed November 13, 2025).

Géron A. Hands-on machine learning with scikit-learn, Keras, and TensorFlow. 3rd ed. Sebastopol: O’Reilly Media, 2022.

Werbos P. J. Backpropagation through time: What it does and how to do it. Proceedings of the IEEE. 1990. Vol. 78, No. 10. P. 1550–1560. DOI: https://doi.org/10.1109/5.58337 (accessed November 30, 2025).

Hyndman R. J., Athanasopoulos G. Forecasting: Principles and practice. 3rd ed. OTexts, 2021. URL: https://otexts.com/fpp3/ (accessed December 4, 2025).

Makridakis S., Wheelwright S. C., Hyndman R. J. Forecasting: Methods and applications. 3rd ed. New York: Wiley, 1998.

Published
2026-04-07
How to Cite
Yakymiv, A., & Yehorchenkova, N. (2026). FORECASTING EMPLOYEE PRODUCTIVITY USING LSTM NETWORKS: SYNTHETIC DATA APPROACH AND BASELINE COMPARISON. Economy and Society, (84). https://doi.org/10.32782/2524-0072/2026-84-67