Document Type: Research Paper
Authors
Department of Computer Science, University of Baghdad, Baghdad, Iraq
Abstract
Performance issues such as system resource leaks, application hangs, and Software Aging (SA) can reduce system reliability and degrade the user experience. These issues therefore need to be analyzed and forecasted so that they can be prevented before they occur. Finding the root cause and analyzing internal behavior is difficult because of the complexity of modern systems such as the Microsoft Windows Operating System (OS). Microsoft provides several tools and platforms, such as the Performance Monitor (PerfMon.exe) tool and the Performance Counter for Windows (PCW) platform, for monitoring activity inside the Windows OS. This paper uses Windows OS tools to simulate performance issues in an experiment, collect data, and convert log formats. In contrast to other works, a deep learning Long Short-Term Memory (LSTM) model and an Autoregressive Integrated Moving Average (ARIMA) model were built and compared, and the model yielding the lowest prediction error on the simulated performance issue was selected. The results favor the ARIMA model with order (2,1,1), which provides the lowest observed error for both MAE and RMSE compared with other candidate lag orders. The LSTM model has an error of 4.796, whereas the ARIMA model has an error of 0.0119. These results confirm that the ARIMA model with the selected parameters can predict the small jump fluctuations observed in the memory metric.
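To illustrate the evaluation described above, the following is a minimal sketch (not the authors' exact pipeline) that fits an ARIMA(2,1,1) model to a memory counter series exported from PerfMon and scores its forecasts with MAE and RMSE. The file name "memory_counter.csv", the column layout, and the 80/20 train/test split are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from sklearn.metrics import mean_absolute_error, mean_squared_error

# Hypothetical CSV exported from a PerfMon memory counter log:
# first column is the timestamp index, second column is the counter value.
series = pd.read_csv("memory_counter.csv", index_col=0).iloc[:, 0].astype(float)

# 80/20 train/test split (assumption, for illustration only).
split = int(len(series) * 0.8)
train, test = series.iloc[:split], series.iloc[split:]

# Fit ARIMA with order (2,1,1), the order reported in the paper.
model = ARIMA(train, order=(2, 1, 1)).fit()

# Forecast over the held-out horizon and compute the error metrics.
preds = model.forecast(steps=len(test))
mae = mean_absolute_error(test, preds)
rmse = np.sqrt(mean_squared_error(test, preds))
print(f"MAE = {mae:.4f}, RMSE = {rmse:.4f}")
```

An LSTM baseline could be scored with the same MAE/RMSE functions on the same test window, allowing a direct comparison of the two models' errors as reported in the abstract.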
Keywords