R. Brown (1960). Statistical Forecasting for Inventory Control, 123.
Sungil Kim, Heeyoung Kim (2016). A new metric of absolute percentage error for intermittent demand forecasts. International Journal of Forecasting, 32.
J. Cancelo, A. Espasa, Rosmarie Grafe (2008). Forecasting the electricity load from one day to one week ahead for the Spanish system operator. International Journal of Forecasting, 24.
G. Bush (1989). Developments in the continuous galvanizing of steel. JOM, 41.
M. López, S. Valero, C. Senabre (2017). Short-term load forecasting of multiregion systems using mixed effects models. 2017 14th International Conference on the European Energy Market (EEM).
Slawek Smyl (2020). A hybrid method of exponential smoothing and recurrent neural networks for time series forecasting. International Journal of Forecasting.
Huaizhi Wang, Zhenxing Lei, Xian Zhang, Bin Zhou, Jianchun Peng (2019). A review of deep learning for renewable energy forecasting. Energy Conversion and Management.
Waqas Ahmad, N. Ayub, Tariq Ali, Muhammad Irfan, M. Awais, M. Shiraz, A. Głowacz (2020). Towards Short Term Electricity Load Forecasting Using Improved Support Vector Machine and Extreme Learning Machine. Energies.
N. Tang (1999). Characteristics of continuous-galvanizing baths. Metallurgical and Materials Transactions B, 30.
O. Trull, J. García-Díaz, A. Troncoso (2019). Application of Discrete-Interval Moving Seasonalities to Spanish Electricity Demand Forecasting during Easter. Energies.
Tao Hong, Jingrui Xie, Jonathan Black (2019). Global energy forecasting competition 2017: Hierarchical probabilistic load forecasting. International Journal of Forecasting.
D. Koller, N. Friedman (2009). Probabilistic Graphical Models: Principles and Techniques.
Rob Hyndman, Yeasmin Khandakar (2008). Automatic Time Series Forecasting: The forecast Package for R. Journal of Statistical Software, 27.
P. Winters (1960). Forecasting Sales by Exponentially Weighted Moving Averages. Management Science, 6.
Hang Xie, Hao Tang, Y. Liao (2009). Time series prediction based on NARX neural networks: An advanced approach. 2009 International Conference on Machine Learning and Cybernetics, 3.
J. García-Díaz (2008). Fault detection and diagnosis in monitoring a hot dip galvanizing line using multivariate statistical process control.
C. Chatfield (1978). The Holt-Winters Forecasting Procedure. Applied Statistics, 27.
S. Alessio (2016). Digital Signal Processing and Spectral Analysis for Scientists.
S. Fallah, Mehdi Ganjkhani, Shahaboddin Shamshirband, K. Chau (2019). Computational Intelligence on Short-Term Load Forecasting: A Methodological Overview. Energies.
Spyros Makridakis, Evangelos Spiliotis, V. Assimakopoulos (2018). The M4 Competition: Results, findings, conclusion and way forward. International Journal of Forecasting.
Rb Cleveland, W. Cleveland, J. McRae, Irma Terpenning (1990). STL: A seasonal-trend decomposition procedure based on loess (with discussion).
E. Gardner, E. McKenzie (2011). Why the damped trend works. Journal of the Operational Research Society, 62.
James Taylor (2010). Triple seasonal methods for short-term electricity demand forecasting. European Journal of Operational Research, 204.
(2020). Electricity Load and Price Forecasting Webinar Case Study.
Rob Hyndman, A. Koehler, J. Ord, R. Snyder (2008). Forecasting with Exponential Smoothing: The State Space Approach.
P. Brockwell, R. Davis (1991). Time Series: Theory and Methods, 2nd ed.; Springer Science & Business Media.
James Taylor, P. McSharry (2007). Short-Term Load Forecasting Methods: An Evaluation Based on European Data. IEEE Transactions on Power Systems, 22.
A. Debón, J. García-Díaz (2012). Fault diagnosis and comparing risk for the steel coil manufacturing process using statistical models for binary data. Reliability Engineering & System Safety, 100.
E. Gardner, E. McKenzie (1985). Forecasting Trends in Time Series. Management Science, 31.
James Taylor, R. Buizza (2003). Using weather ensemble predictions in electricity demand forecasting. International Journal of Forecasting, 19.
E. Gardner (2006). Exponential smoothing: The state of the art, Part II. International Journal of Forecasting, 22.
H. Hippert, C. Pedreira, R. Souza (2001). Neural networks for short-term load forecasting: a review and evaluation. IEEE Transactions on Power Systems, 16.
Rob Hyndman, G. Athanasopoulos (2013). Forecasting: Principles and Practice.
Ebrahim Ghaderpour, E. Ince, S. Pagiatakis (2018). Least-squares cross-wavelet analysis and its applications in geophysical time series. Journal of Geodesy, 92.
V. Gómez (2015). SSMMATLAB: A Set of MATLAB Programs for the Statistical Analysis of State Space Models. Journal of Statistical Software, 66.
Jaime Buitrago, S. Asfour (2017). Short-Term Forecasting of Electric Loads Using Nonlinear Autoregressive Artificial Neural Networks with Exogenous Vector Inputs. Energies, 10.
O. Trull, J. García-Díaz, A. Troncoso (2020). Stability of Multiple Seasonal Holt-Winters Models Applied to Hourly Electricity Demand in Spain. Applied Sciences.
Alysha Livera, Rob Hyndman, R. Snyder (2011). Forecasting Time Series With Complex Seasonal Patterns Using Exponential Smoothing. Journal of the American Statistical Association, 106.
G. Sudheer, A. Suseelatha (2015). Short term load forecasting using wavelet transform combined with Holt–Winters and weighted nearest neighbor models. International Journal of Electrical Power & Energy Systems, 64.
J. Nelder, R. Mead (1965). A Simplex Method for Function Minimization. The Computer Journal, 7.
G. Box, G. Jenkins (1970). Time Series Analysis: Forecasting and Control; Holden-Day: San Francisco, CA, USA.
J. Durbin, S. Koopman (2002). A simple and efficient simulation smoother for state space time series analysis. Biometrika, 89.
R. Shumway, D. Stoffer (2011). Characteristics of Time Series.
Ebrahim Ghaderpour, S. Pagiatakis (2019). LSWAVE: a MATLAB software for the least-squares wavelet and cross-wavelet analyses. GPS Solutions, 23.
R. Weron (2006). Modeling and Forecasting Electricity Loads and Prices: A Statistical Approach.
T. Choi, Yong Yu, K. Au (2011). A hybrid SARIMA wavelet transform method for sales forecasting. Decision Support Systems, 51.
P. Bickel, Y. Gel (2011). Banded regularization of autocovariance matrices in application to parameter estimation and forecasting of time series. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 73.
C. Holt (1957). Forecasting Seasonals and Trends by Exponentially Weighted Moving Averages. Carnegie Institute of Technology, Graduate School of Industrial Administration.
Grzegorz Dudek (2016). Pattern-based local linear regression models for short-term load forecasting. Electric Power Systems Research, 130.
J. Lagarias, J. Reeds, M. Wright, P. Wright (1998). Convergence Properties of the Nelder-Mead Simplex Method in Low Dimensions. SIAM Journal on Optimization, 9.
F. Ajersch, F. Ilinca, J. Hétu (2004). Simulation of flow in a continuous galvanizing bath: Part II. Transient aluminum distribution resulting from ingot addition. Metallurgical and Materials Transactions B, 35.
A. Kavousi-fard, Mohammad-Reza Akbari-Zadeh (2014). A hybrid method based on wavelet, ANN and ARIMA model for short-term load forecasting. Journal of Experimental & Theoretical Artificial Intelligence, 26.
J. Bermúdez (2013). Exponential smoothing with covariates applied to electricity demand forecast. European Journal of Industrial Engineering, 7.
G. Zahedi, S. Azizi, A. Bahadori, A. Elkamel, S. Alwi (2013). Electricity demand estimation using an adaptive neuro-fuzzy network: A case study from the Ontario province, Canada. Energy, 49.
S.M.A. Shibli, B. Meena, R. Remya (2015). A review on recent approaches in the field of hot dip zinc galvanizing process. Surface & Coatings Technology, 262.
J. Breidt (2005). Nonlinear Time Series: Nonparametric and Parametric Methods. Journal of the American Statistical Association, 100.
Niematallah Elamin, M. Fukushige (2018). Modeling and forecasting hourly electricity demand by SARIMAX with interactions. Energy.
Ebrahim Ghaderpour, T. Vujadinovic (2020). The Potential of the Least-Squares Spectral and Cross-Wavelet Analyses for Near-Real-Time Disturbance Detection within Unequally Spaced Satellite Image Time Series. Remote Sensing, 12.
James Taylor, L. Menezes, P. McSharry (2006). A comparison of univariate methods for forecasting electricity demand up to a day ahead. International Journal of Forecasting, 22.
J. Bermúdez, J. Segura, E. Vercher (2006). Improving demand forecasting accuracy using nonlinear programming software. Journal of the Operational Research Society, 57.
R. Michalski, J. Carbonell, T. Mitchell (Eds.) (1983). Machine Learning: An Artificial Intelligence Approach.
J. Taylor (2003). Short-term electricity demand forecasting using double seasonal exponential smoothing. Journal of the Operational Research Society, 54.
Ebrahim Ghaderpour, S. Pagiatakis (2017). Least-Squares Wavelet Analysis of Unequally Spaced and Non-stationary Time Series and Its Applications. Mathematical Geosciences, 49.
J. García-Díaz, O. Trull (2016). Competitive Models for the Spanish Short-Term Electricity Demand Forecasting.
J. Moré (1977). The Levenberg–Marquardt algorithm: implementation and theory.
Y. Lu, S. Abourizk (2009). Automated Box–Jenkins forecasting modelling. Automation in Construction, 18.
J. Roux (2003). An Introduction to the Kalman Filter.
Ahsan Khan, S. Razzaq, T. Alquthami, M. Moghal, A. Amin, A. Mahmood (2018). Day ahead load forecasting for IESCO using Artificial Neural Network and Bagged Regression Tree. 2018 1st International Conference on Power, Energy and Smart Grid (ICPESG).
Spyros Makridakis, M. Hibon (1997). ARMA Models and the Box–Jenkins Methodology. Journal of Forecasting, 16.
R. Tibshirani (2011). Regression shrinkage and selection via the lasso: a retrospective. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 73.
S. Bercu, F. Proïa (2012). A SARIMAX coupled modelling applied to individual load curves intraday forecasting. Journal of Applied Statistics, 40.
Qianjie Liu, Wei Chen, Huosheng Hu, Qingyuan Zhu, Zhixiang Xie (2020). An Optimal NARX Neural Network Identification Model for a Magnetorheological Damper With Force-Distortion Behavior, 7.
Arjun Baliyan, K. Gaurav, S. Mishra (2015). A Review of Short Term Load Forecasting using Artificial Neural Network Models. Procedia Computer Science, 48.
J. Segura, E. Vercher (2001). A spreadsheet modeling approach to the Holt-Winters optimal forecasting. European Journal of Operational Research, 131.
Article

Forecasting Irregular Seasonal Power Consumption. An Application to a Hot-Dip Galvanizing Process

Oscar Trull 1,*, J. Carlos García-Díaz 1 and Angel Peiró-Signes 2

1 Department of Applied Statistics, Operational Research and Quality, Universitat Politècnica de València, E-46022 Valencia, Spain; juagardi@eio.upv.es
2 Management Department, Universitat Politècnica de València, E-46022 Valencia, Spain; anpeisig@omp.upv.es
* Correspondence: otrull@eio.upv.es

Featured Application: The method described in this document makes it possible to use the techniques usually applied to load prediction efficiently in those situations in which the series clearly presents seasonality but does not maintain a regular pattern.

Abstract: Distribution companies use time series to predict electricity consumption, relying on forecasting techniques based on statistical models or artificial intelligence. Reliable forecasts are required for efficient grid management in terms of both supply and capacity. One common underlying feature of most demand-related time series is a strong seasonality component. However, in some cases, the electricity demanded by a process presents an irregular seasonal component, which prevents any type of forecast. In this article, we evaluated forecasting methods based on the use of multiple seasonal models: ARIMA, Holt-Winters models with discrete interval moving seasonality, and neural networks. The models are explained and applied to a real situation, for a node that feeds a galvanizing factory. The zinc hot-dip galvanizing process is widely used in the automotive sector for the protection of steel against corrosion. It requires enormous energy consumption, and this has a direct impact on companies' income statements. In addition, it significantly affects energy distribution companies, as these companies must provide for instant consumption in their supply lines to ensure sufficient energy is distributed both for the process and for all the other consumers. The results show a substantial increase in the accuracy of predictions, which contributes to a better management of the electrical distribution.

Keywords: time series; demand; load; forecast; DIMS; irregular; galvanizing

Citation: Trull, O.; García-Díaz, J.C.; Peiró-Signes, A. Forecasting Irregular Seasonal Power Consumption. An Application to a Hot-Dip Galvanizing Process. Appl. Sci. 2021, 11, 75. https://doi.org/10.3390/app11010075

Received: 1 November 2020; Accepted: 22 December 2020; Published: 23 December 2020

Copyright: © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

1. Introduction

Demand management is a primary process in the development of industrial activity. Distribution companies must ensure a supply is provided at a reasonable cost, and for this reason, they need to manage resources efficiently. The use of electrical prediction models contributes to their management of the distribution lines by offering tools to estimate future demand with great precision. The techniques allow for forecasting based on time series using statistical models or artificial intelligence (AI).

The most widely used univariate forecasting tools for electricity demand can be classified into three broad groups [1]: fundamental models, statistical models, and computational models. There is growing interest in the use of computational models, although the most widely used models are statistical models, both exponential smoothing models and autoregressive integrated moving average (ARIMA) models.

The fundamental models are made up of hybrid models that introduce all the possible physical variables, adopting a complex relationship between them and also using the techniques of statistical models.

Computational models are based on AI and emulate natural behaviors through the use of mathematical models. These are algorithms whose learning is automatic and are part of the science of Machine Learning [2]. At present, deep learning techniques represent an evolution and have found applications in demand forecasting, especially in areas where prediction is difficult, such as renewable energies [3]. The most widely used techniques for electricity demand are artificial neural networks (ANN) [4], particularly nonlinear autoregressive neural networks with exogenous variables (NARX) [5,6]. Support vector machines (SVM) [7] and bagged regression trees (BRT) [8] also stand out, and these occasionally apply fuzzy logic [9].

Electricity demand series show stochastic behavior, and they have traditionally been modeled using statistical methods. The ARIMA models are considered to be the econometric models par excellence. The Box–Jenkins methodology [10] is used to determine which ARIMA model to use, although some authors [11] state that simpler methods are better than this methodology at providing forecasts. The application of ARIMA models to demand is usually carried out in a general way with Seasonal Autoregressive Integrated Moving Average Exogenous (SARIMAX) models [12–14], in which exogenous variables are included to improve demand forecasts. The introduction of two seasonalities allows substantial improvement in the predictions of these models [15].

State-space models (SSM) are a form of exponential smoothing representation.
They are commonly applied to demand [16], especially since the introduction of the Kalman filter (see [17]). They also allow the introduction of various seasonalities in a complex way [18] and with covariates [19]. De Livera and Hyndman include modifications that adjust the error using autoregressive moving average (ARMA) models, with Box-Cox transformations (BATS [20]) and trigonometric seasonality (TBATS [18]).

Other very common smoothing techniques are the Holt-Winters models [20]. These models are excellent predictors for time series with marked seasonality [1,21]. The inclusion of more seasonality [22–24] improves their forecasts, leading to the development of multiple seasonal Holt-Winters models (nHWT). Trull et al. [25] introduce discrete seasonality that takes into account seasonalities whose occurrences are not regular (nHWT-DIMS models).

The current trend is to create hybrid models in which traditional techniques are combined with machine learning [26,27]. An example can be found in [28], which applies an exponential smoothing method and neural networks to divide the forecasting process between a linear part and a non-linear part. The use of wavelets for irregular series has been combined with ARIMA models [29], Holt-Winters models [30], or ANN [31].

Regularization techniques have also been applied to prevent over- and under-fitting issues, based on the Least Absolute Shrinkage and Selection Operator (LASSO) [32], and have been applied to short-term load forecasting models based on multiple linear regression [33]. Banded regularization is also used to estimate parameters without overfitting in autoregressive models [34].

Newer methods use an anti-leakage least-squares spectral analysis (ALLSSA) to simultaneously estimate the trend and seasonal components before making a regularization and producing forecasts [35]. The ALLSSA method determines the statistical significance of the components, preventing under- and over-fitting issues. The least-squares wavelet analysis (LSWA) is a natural extension of the least-squares spectral analysis and allows the forecaster to obtain spectrograms for equally and unequally spaced time series and to identify statistically significant peaks in the time series [36].

One common feature of most demand-related time series is their strong seasonality components [37]. In some cases, the electricity demanded by a process could present an irregular seasonal component that seriously distorts the behavior of the series in a way that the models cannot deal with.

The zinc hot-dip galvanizing process is widely used in the automotive sector to protect steel against corrosion [38,39]. It requires an enormous consumption of energy, and this has a direct impact on companies' income statements. However, the process also significantly affects energy distribution companies, since they must foresee the instantaneous consumption in their lines in order to ensure the distribution of energy both for the process and for the other consumers. A characteristic of the demand in this process is the presence of seasonal patterns that resemble seasonality but, because of their irregular behavior, are difficult to assimilate to seasonality.

The structure shown by the series in this study means that it is more suitable to work with time series models rather than frequency or signal analysis. We have therefore considered it convenient to preferably use traditional time series models with seasonality.

In this article, we present several solutions to this problem based on the use of ARIMA models, multiple seasonal Holt-Winters models with and without discrete interval moving seasonalities (DIMS), state space models, and neural network models.
To verify the effectiveness of the techniques described, they are applied to the industrial process of hot-dip galvanizing.

The article is organized as follows: Section 2 conducts a review of the forecasting methods as well as an explanation of the production process; Section 3 demonstrates the results and their analysis; Section 4 discusses the results; and finally, in Section 5, the conclusions are summarized.

2. Materials and Methods

2.1. Study Area

The study has been applied to the consumption node of a hot-dip galvanizing company. The process is carried out by coating extruded steel strips with a zinc foil that forms an alloy with the steel and gives the desired properties. This process is continuous and produces high-quality products [40].

Figure 1 shows a general scheme for the galvanizing process, where the greatest consumption is in the zinc bath. In the annealing furnace, the steel strip is preheated, and then it is immersed in a bath of molten zinc at 460 °C. Subsequently, the galvanized steel strip goes through the skin-pass process [41–43] after it has cooled down.

Figure 1. Representation of a hot dip galvanizing process.

The zinc bath consists of a molten alloy of Zn, which is kept at 460 °C by the action of two heating inductors located at the bottom of the bath. Figure 2 schematically shows the operation of the zinc bath. The bath temperature is measured as an average of the local temperatures provided by the thermocouples Ta, Tb and Tc. The inductors heat from the bottom of the bath, and a natural flow inside the bath is produced so that the bath achieves the targeted temperature.

Figure 2. Galvanizing section (hot dip zinc bath). Pretreated steel goes into the zinc pot, which is filled with an Al–Zn solution at 460 °C. Thermocouples Ta, Tb and Tc measure the local temperatures in the bath. Induction heaters located at the base of the bath keep the temperature as targeted. After the bath, the steel is coated with Zn.

The electrical consumption associated with the process can be seen in Figure 3. This graph shows the consumption for eight working days, measured every six minutes. It begins on November 14th, 2009 at 00:00 am and ends on November 22nd, 2009 at 08:00 am. There are in total 2000 measurements. The oscillations shown in the time series are produced by the action of the induction heaters that keep the bath at the targeted temperature. The big peaks in consumption are produced when the bath needs to be recharged, and new Zn (dropped in ingots) is added into the bath. At this moment, the heaters must be put into full operation. From this dataset, the first 1800 observed values are used for training purposes, and the last 200 are used for validation.

Figure 3. Electricity demand for the hot-dip galvanizing process. The ticks represent the beginning of each day. The blue dataset designates the data used for training, whereas the red one represents the data used for testing and validation.

A series of cyclical patterns can be observed (oscillations) that are repeated throughout the series, with a short-length pattern that is repeated continuously throughout the series clearly standing out. There are other patterns that are repeated irregularly, with differentiated behaviors.

A closer view of the series is shown in Figure 4. In graphs (a) and (b), a common underlying pattern can be identified, with a length of around ten time units (which means an hour, as consumption is measured every six minutes). This first pattern is repeated regularly over the whole time period, and it is considered as a seasonality.
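A repeating pattern of this kind can be checked numerically. The sketch below (plain NumPy on a synthetic series, not the authors' dataset) estimates the length of the short pattern as the lag of the strongest sample autocorrelation:

```python
import numpy as np

def first_acf_peak(y, max_lag=50):
    """Estimate the length of the dominant repeating pattern as the lag
    of the largest sample autocorrelation over lags 2..max_lag."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    acf = [np.dot(y[:-k], y[k:]) / np.dot(y, y) for k in range(1, max_lag + 1)]
    return int(np.argmax(acf[1:])) + 2  # skip lag 1, report the lag itself

# synthetic consumption-like series: a 10-sample oscillation plus noise
rng = np.random.default_rng(0)
t = np.arange(600)
y = 50 + 5 * np.sin(2 * np.pi * t / 10) + rng.normal(0, 0.5, t.size)
print(first_acf_peak(y))  # → 10, the length of the repeating pattern
```

On the real series the same idea would surface the roughly ten-sample (one-hour) oscillation; the irregular second pattern, by contrast, produces no single stable peak.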
Figure 4. Close-up version of Figure 3, where different seasonal patterns can be located: a first pattern along the whole series, with short oscillations as shown in (a,b); and a second pattern covering the consumption peaks, as shown in (c,d). This second pattern has a different length on every appearance.

Figure 4, graphs (c) and (d), show the time series after removing the first seasonal pattern. It can also be seen that other patterns develop throughout the series in such a way that the time of their appearance or their length is not constant. Technically, this non-regular behavior cannot be considered as a seasonality, since it is not possible to predict the fluctuation pattern that will develop in the future. To make consumption predictions, it is necessary to take into account this seasonal behavior, even though it is not regular.

2.2. Forecasting Methods

In this section, we describe the forecasting methods applied to the time series under study. The most common methods applied to short-term electricity demand forecasting, using both AI and statistical methods, have been chosen. First, the methods used with regular seasonal behavior are described, and then we describe the models with discrete seasonality.

2.2.1. Artificial Neural Networks

Neural networks are computational models structured in the form of layers with nodes interconnected as a network. They are so named because of their resemblance to the structure of the human brain. The nodes, called neurons, perform simple operations in parallel and are located in each of the layers of the network. The layers are of three different types: the input layer, where neurons receive direct information from the inputs; the hidden layer(s), whose neurons use the information from the neurons of previous layers and feed the next layers; and the output layer, where neurons use the information from the hidden layers to produce an output. Thus, there is an input layer, one or more hidden layers, and an output layer. The connections between the different layers are made through the connection of their neurons, which are called synapses. The strength of the connection between neurons is determined by a weighting established at the synapse.

The most suitable structure for forecasting time series is the NARX type structure [44,45]. It is a recurrent dynamic neural network, with feedback connections. Figure 5 shows a closed-loop representation of the NARX structure [46]. Neurons receive information from exogenous input variables in addition to the target series itself and the feedbacks. In order to improve forecasts, the past predicted and observed values can be used, delayed through a tapped delay line (TDL) memory. The circles after the input layers denote the TDL delay (e.g., one to two delays in the figure).

Figure 5. NARX neural network schema. There is an input layer with variables, one hidden layer and one output layer. Circles represent tapped delay lines (TDL).

The input variables $x_t$ are exogenous variables used in the model. Both $x_t$ and $y_t$ are connected by axons to which weights $w$ are assigned, and with an activation function $f$ that is integrated with an aggregation function $\Sigma$. The output $\hat{y}_t$ provides future forecasts after the network has been trained. $b$ stands for the bias, whose presence increases or decreases the neuron's processing capacity.

The mathematical, non-linear representation that governs the network is shown in (1), where $x_t$ represents the inputs and $y_t$ the objective function, while $\hat{y}_{t+1}$ represents the prediction. $D_x$ and $D_y$ are the time delays applied in the network:

$\hat{y}_{t+1} = f\left(x_t, x_{t-1}, \ldots, x_{t-D_x}, y_t, y_{t-1}, \ldots, y_{t-D_y}\right)$. (1)

The NARX neural network maps the function through the multilayer perceptron, using the time delays for both the input variables and the output feedback [47].
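The tapped delay lines of Equation (1) can be made concrete by building the lagged design matrix such a network is trained on. The sketch below uses plain NumPy and toy data; the delay orders $D_x = D_y = 2$ are arbitrary illustrative choices, not the settings used in this study:

```python
import numpy as np

def narx_design(y, x, dy=2, dx=2):
    """Build the lagged (tapped-delay-line) design matrix of Eq. (1):
    each row holds x_t..x_{t-dx} and y_t..y_{t-dy}; the target is y_{t+1}."""
    start = max(dx, dy)
    rows, targets = [], []
    for t in range(start, len(y) - 1):
        x_lags = [x[t - k] for k in range(dx + 1)]
        y_lags = [y[t - k] for k in range(dy + 1)]
        rows.append(x_lags + y_lags)
        targets.append(y[t + 1])
    return np.array(rows), np.array(targets)

# toy series: the exogenous input leads the target by one step
x = np.arange(10, dtype=float)
y = x + 1.0
X, targets = narx_design(y, x, dy=2, dx=2)
print(X.shape)  # → (7, 6): 7 usable rows, 3 x-lags + 3 y-lags
```

Any regressor, such as the MLP described above, can then be trained to map each row to its target; in closed-loop forecasting, the feedback lags are filled with the model's own past predictions.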
An alternative to this neural network is a function fitting neural network. This is a type of shallow neural network based on the multilayer perceptron (MLP) with which we can make adjustments to non-linear functions (non-linear regression, NLR). The use and application of such a network for the prediction of electricity demand has been discussed previously [48]. The mathematical representation that governs this network is shown in (2):

$\hat{y}_t = f\left(x_1, x_2, \ldots, x_n\right)$. (2)

Here $x_i$ are the predictors, which are several variables (including the observed values of the time series) used to feed the model. A representative schema for this neural network is shown in Figure 6.

Figure 6. Function fitting neural network schema.

By training the network, weights are assigned to the synaptic connections, minimizing an error criterion. The ANNs used in this work are trained using the Levenberg-Marquardt algorithm [49], minimizing the mean squared error (MSE). After the training process, the network is converted to a closed loop, and forecasts are provided.

2.2.2. ARIMA Models

ARIMA models were introduced by Box and Jenkins [50] to model non-stationary series and allow predictions to be made. A description and in-depth analysis can be found in [51] and in the book by Brockwell and Davis [52]. Seasonal ARIMA models are usually denoted by ARIMA $(p,d,q) \times (P,D,Q)_S$, where $S$ indicates the length of the seasonal pattern under consideration. The compact representation of the ARIMA model is, as shown in (3), a function of autoregressive polynomials, moving average polynomials, and difference operators:

$\phi_p(B)\,\Phi_P(B^S)\,\nabla^d\,\nabla_S^D\, y_t^{(\lambda)} = c + \theta_q(B)\,\Theta_Q(B^S)\,\varepsilon_t, \quad \varepsilon_t \sim N(0,\sigma^2)$. (3)

$y_t,\ t = 0, 1, 2, \ldots$ are the observed data of the univariate series. If the variability in the data grows with time, it is necessary to transform the data to stabilize the variance. The Box-Cox power transformation family is a general class of variance-stabilizing transformations. The Box-Cox transformation of $y_t$ with power parameter $\lambda$ to the transformed data $y_t^{(\lambda)}$ is defined by (4):

$y_t^{(\lambda)} = \begin{cases} \dfrac{y_t^{\lambda} - 1}{\lambda}, & \text{if } \lambda \neq 0, \\ \ln y_t, & \text{if } \lambda = 0. \end{cases}$ (4)

The power parameter $\lambda$ is estimated by the maximum-likelihood method. The polynomials $\phi_p(B) = 1 - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p$ and $\theta_q(B) = 1 - \theta_1 B - \theta_2 B^2 - \cdots - \theta_q B^q$ represent the regular (non-seasonal) autoregressive and moving average components, respectively, and the polynomials $\Phi_P(B^S) = 1 - \Phi_1 B^S - \Phi_2 B^{2S} - \cdots - \Phi_P B^{PS}$ and $\Theta_Q(B^S) = 1 - \Theta_1 B^S - \Theta_2 B^{2S} - \cdots - \Theta_Q B^{QS}$ represent the seasonal autoregressive and moving average components, respectively, with $B$ as the lag operator, $B^k y_t = y_{t-k}$. $\nabla$ is the backward difference operator: $\nabla = 1 - B$ and $\nabla_S = 1 - B^S$. $d$ and $D$ are the numbers of differencings required to make the time series stationary ($d, D \leq 2$). $\varepsilon_t$ is a Gaussian white noise process, $\varepsilon_t \sim N(0,\sigma^2)$, and $c$ is the model constant. The orders of the polynomials $\{p, q; P, Q\}$ are selected using Akaike's Information Criterion (AIC, AICc) or Schwarz's Bayesian Information Criterion (SIC or BIC). The model coefficients $\phi_1, \ldots, \phi_p;\ \theta_1, \ldots, \theta_q;\ \Phi_1, \ldots, \Phi_P;\ \Theta_1, \ldots, \Theta_Q$ and $\sigma^2$ are estimated by the maximum likelihood method.

ARIMA models can present more than one seasonality, as indicated in (5). To do this, the models are expressed as ARIMA $(p,d,q) \times (P_1,D_1,Q_1)_{S_1} \times (P_2,D_2,Q_2)_{S_2}$, where $S_1$ and $S_2$ indicate the two seasonalities to which they refer:

$\phi_p(B)\,\Phi_{P_1}(B^{S_1})\,\Omega_{P_2}(B^{S_2})\,\nabla^d\,\nabla_{S_1}^{D_1}\,\nabla_{S_2}^{D_2}\, y_t^{(\lambda)} = c + \theta_q(B)\,\Theta_{Q_1}(B^{S_1})\,\Psi_{Q_2}(B^{S_2})\,\varepsilon_t$. (5)

The polynomials $\Omega_{P_2}(B^{S_2})$ and $\Psi_{Q_2}(B^{S_2})$ represent the second seasonal autoregressive and moving average components, respectively.

2.2.3. Multiple Seasonal Holt-Winters Models

Exponential smoothing uses information from the past through weighted averages to make predictions. The weight decreases as newer values are entered into the time series, giving more importance to newer data over older. A smoothing parameter determines this weight. The introduction of these models dates back to the 1960s with the work of Holt [53] and Brown [54].
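As a minimal sketch of this principle (illustrative only, not the estimation procedure used in this study), simple exponential smoothing reduces to a one-line recursion in which the parameter $\alpha$ controls how quickly the weights decay:

```python
import numpy as np

def simple_exp_smoothing(y, alpha):
    """Brown-style simple exponential smoothing: each smoothed value is a
    weighted average of the newest observation and the previous smoothed value,
    so older data receive geometrically decaying weights (1 - alpha)^k."""
    s = y[0]                      # initialise with the first observation
    for value in y[1:]:
        s = alpha * value + (1 - alpha) * s
    return s                      # flat forecast for the next period

y = np.array([10.0, 12.0, 11.0, 13.0, 12.0])
print(simple_exp_smoothing(y, alpha=0.5))  # → 12.0
```

A small $\alpha$ gives smooth, slowly reacting forecasts; a large $\alpha$ tracks the newest observations closely. The Holt-Winters family extends this recursion to the trend and seasonal components.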
Winters [20] presented the Holt-Winters models, in which exponential smoothing techniques are performed on the three components of the series: level ($l_t$), trend ($b_t$) and seasonality ($s_t$). The model includes a series of structured equations, called smoothing equations, the information from which is compiled by a forecast equation to provide forecasts. The equations can be combined with additive or multiplicative trends and seasonality.

Gardner and McKenzie [55] introduced a damping factor for the trend, and their model outperforms the previous models when the trend shows high variations [56]. Taylor broke down seasonality into two or three nested components so that the models can capture series that present more than one seasonality, such as series for short-term demand [22,23]. Taylor also included in the model an adjustment using the one-step-ahead error as proposed by Chatfield [57]. This adjustment adds an AR(1) model for the residuals, obtaining its parameter at the same time as the smoothing parameters. In the same way, García-Díaz and Trull [24] generalized the model, including the way the initial values are obtained, to $n$ seasonalities. The nHWT models, with multiplicative seasonality and additive damped trend, are shown in Equations (6)-(9):

$l_t = \alpha \dfrac{y_t}{\prod_i s_{t-s_i}^{(i)}} + (1-\alpha)\left(l_{t-1} + \varrho\, b_{t-1}\right)$, (6)

$b_t = \gamma\left(l_t - l_{t-1}\right) + (1-\gamma)\,\varrho\, b_{t-1}$, (7)

$s_t^{(i)} = \delta^{(i)} \dfrac{y_t}{l_t \prod_{j \neq i} s_{t-s_j}^{(j)}} + \left(1-\delta^{(i)}\right) s_{t-s_i}^{(i)}$, (8)

$\hat{y}_{t+k} = \left(l_t + \sum_{j=1}^{k} \varrho^{j}\, b_t\right) \prod_i s_{t-s_i+k}^{(i)} + \varphi^{k}\, \varepsilon_t$. (9)

The smoothing equations include the smoothing parameters $\alpha$, $\gamma$ and $\delta^{(i)}$ for smoothing the level, the trend and the different seasonal indices $i$ of length $s_i$. Equation (9) provides the $k$-step-ahead predictions $\hat{y}_{t+k}$ from the observed values of the series $y_t$. Here, $\varepsilon_t$ is the one-step-ahead error, and $\varphi$ is the parameter for the AR(1) adjustment. The damping parameter for the trend is denoted by $\varrho$ [58].

The equations of the model are recursive, and therefore they need initial values so that they can fit the model. Several methodologies for initialization have been documented [56,57].
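For the special case of a single seasonality, no AR(1) adjustment, and crude heuristic initial values (all simplifying assumptions for illustration; the nHWT models used in this study are more general), the recursions (6)-(8) and the forecast Equation (9) can be sketched as:

```python
import numpy as np

def hw_forecast(y, s_len, alpha, gamma, delta, rho=1.0, k=1):
    """One-seasonality sketch of Eqs. (6)-(9): additive damped trend and
    multiplicative seasonality. Initial values are a simple heuristic."""
    l = y[:s_len].mean()                         # initial level
    b = (y[s_len:2 * s_len].mean() - l) / s_len  # initial trend
    s = list(y[:s_len] / l)                      # initial seasonal indices
    for t in range(s_len, len(y)):
        l_prev = l
        l = alpha * y[t] / s[t % s_len] + (1 - alpha) * (l_prev + rho * b)
        b = gamma * (l - l_prev) + (1 - gamma) * rho * b
        s[t % s_len] = delta * y[t] / l + (1 - delta) * s[t % s_len]
    damped = sum(rho**j for j in range(1, k + 1))   # damped trend sum in Eq. (9)
    return (l + damped * b) * s[(len(y) + k - 1) % s_len]

# purely seasonal toy series: the pattern (10, 20) repeats exactly
y = np.array([10.0, 20.0] * 20)
print(round(hw_forecast(y, 2, 0.5, 0.1, 0.1, k=1), 1))  # → 10.0
```

Here the smoothing parameters are fixed by hand purely for illustration; in practice they must be estimated, as described next.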
To be able to use the models, it is necessary to estimate the smoothing parameters by minimizing the error using non-linear algorithms [59,60]. The Nelder-Mead simplex method [61,62] has been used, which minimizes the root mean squared error (RMSE).

2.2.4. State Space Models

The SSM refers to a form of graphical-probabilistic representation [63] used to describe the dependence between an observed measurement and a series of latent state variables through equations, called state equations, that describe its evolution. Given that a time series can be decomposed into level, seasonality and trend components, in time series terms an SSM is understood as the model that interprets the evolution of the relationship between the observed variables (y_t) and the latent, unobservable variables (level, trend, and seasonality).

SSMs have a great variety of formulations. In this paper, the formulation indicated by Durbin and Koopman [64] and Hyndman et al. [16], applied to univariate stochastic time series, is used. These models are structured as two matrix equations, as shown in (10)–(11):

y_t = μ_t + ε_t, with μ_t = r x_{t−1}, (10)
x_t = f x_{t−1} + g ε_t. (11)

Equation (11) is known as the state transition equation, and Equation (10) is known as the observation equation. Here x_t is the vector of states, y_t is the vector of observations, and ε_t is a vector of Gaussian white noise known as the innovation process. r, f and g are matrices and vectors of coefficients with appropriate dimensions: f explains the evolution of x_t, and g provides the innovation correction through ε_t. The term μ_t is the one-step-ahead forecast, and the error ε_t enters the observation equation additively.

De Livera [18] introduced modified models, based on the exponential smoothing methods, in which a Box-Cox transformation is applied to the data and the residuals are modeled using an ARMA process; the models include the damping factor for the trend and multiple seasonalities.
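In scalar form, the innovation recursion (10)–(11) is only a few lines of code. The sketch below is an illustration under the assumptions stated in the comments (the function name and numbers are invented); it also shows the known equivalence that, with f = r = 1 and g = α, the recursion collapses to simple exponential smoothing.

```python
def ssm_filter(ys, x0, f, g, r):
    """Scalar version of Eqs. (10)-(11): mu_t = r*x_{t-1} is the one-step-ahead
    forecast, eps_t = y_t - mu_t the innovation, x_t = f*x_{t-1} + g*eps_t the state."""
    x, preds = x0, []
    for y in ys:
        mu = r * x           # observation equation (10): y_t = mu_t + eps_t
        eps = y - mu         # innovation
        x = f * x + g * eps  # state transition equation (11)
        preds.append(mu)
    return preds

# With f = r = 1 and g = alpha the filter is simple exponential smoothing.
preds = ssm_filter([3.0, 5.0, 4.0, 6.0], x0=3.0, f=1.0, g=0.3, r=1.0)
```

In the full models, x_t stacks level, trend, seasonal and ARMA-error states, and the coefficients are estimated with the Kalman-filter-based algorithm mentioned below.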
The acronym for De Livera's method is BATS (Box-Cox transform, ARMA errors, Trend and Seasonal components). This model is described in (12)–(16):

y_t^{(λ)} = l_{t−1} + ϱ b_{t−1} + Σ_{i=1}^{n} s^{(i)}_{t−s_i} + d_t, (12)
l_t = l_{t−1} + ϱ b_{t−1} + α d_t, (13)
b_t = (1 − ϱ) b + ϱ b_{t−1} + γ d_t, (14)
s^{(i)}_t = s^{(i)}_{t−s_i} + δ_i d_t, (15)
d_t = Σ_{i=1}^{p} φ_i d_{t−i} + Σ_{j=1}^{q} θ_j ε_{t−j} + ε_t. (16)

In these equations, y_t^{(λ)} indicates the value of the observed data after the Box-Cox transformation with parameter λ, described in (4). l_t, b_t and s^{(i)}_t are the values of the level, trend and seasonalities, with smoothing parameters α, γ and δ_i. The subscript i denotes the seasonality under consideration, of seasonal length s_i, and n is the number of seasonalities. d_t is an ARMA(p, q) process for the residuals, whose coefficients are determined by φ_i and θ_j. ϱ is the damping factor for the trend, and b stands for a long-run trend term. ε_t is a Gaussian white noise process N(0, σ²). The nomenclature for BATS includes the arguments (λ, ϱ, p, q, s₁, …, s_n).

Additionally, De Livera et al. [18] presented the same model but with seasonality based on trigonometric functions. The seasonal Equation (15) is replaced by the set of Equations (17)–(20), a seasonal component based on Fourier series. These models are known as TBATS (Trigonometric seasonal BATS):

s_t^{(i)} = Σ_{j=1}^{k_i} s_{j,t}^{(i)}, (17)
s_{j,t}^{(i)} = s_{j,t−1}^{(i)} cos ω_{i,j} + s*_{j,t−1}^{(i)} sin ω_{i,j} + δ₁^{(i)} d_t, (18)
s*_{j,t}^{(i)} = −s_{j,t−1}^{(i)} sin ω_{i,j} + s*_{j,t−1}^{(i)} cos ω_{i,j} + δ₂^{(i)} d_t, (19)
ω_{i,j} = 2πj/s_i. (20)

Every seasonal component of the model s_t^{(i)} results from the sum of the k_i stochastic levels s_{j,t}^{(i)} of period i; s*_{j,t}^{(i)} is the stochastic growth for each period. δ₁^{(i)} and δ₂^{(i)} are the smoothing parameters. The nomenclature for TBATS includes the arguments (λ, ϱ, p, q, {s₁, k₁}, …, {s_n, k_n}).

Obtaining the values of the previous matrices and vectors requires the application of an algorithm based on the Kalman filter and the maximum-likelihood function, using the sum of squared errors (SSE) as the minimization criterion. This algorithm carries a high computational load and obtains these parameters iteratively.
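Equations (18)–(19) are a rotation of the pair (s, s*) by the angle ω of Eq. (20), so with the disturbance switched off the pair returns to its starting value after exactly one seasonal period. A minimal sketch of one harmonic (the function name and values are invented for the example):

```python
import math

def trig_season_step(s, s_star, omega, delta1=0.0, delta2=0.0, d_t=0.0):
    """One update of Eqs. (18)-(19): rotate (s, s*) by omega, then add the
    smoothed ARMA disturbance d_t."""
    s_new = s * math.cos(omega) + s_star * math.sin(omega) + delta1 * d_t
    s_star_new = -s * math.sin(omega) + s_star * math.cos(omega) + delta2 * d_t
    return s_new, s_star_new

# With delta1 = delta2 = 0 the pair cycles with period s_i: after s_i steps with
# omega = 2*pi*j/s_i (harmonic j = 1) the state returns to where it started.
period = 10                          # the one-hour seasonality of the paper
omega = 2 * math.pi * 1 / period     # Eq. (20)
s, s_star = 1.0, 0.0
for _ in range(period):
    s, s_star = trig_season_step(s, s_star, omega)
```

This is why TBATS can represent a seasonal cycle of any (even non-integer) length with a handful of harmonics instead of s_i separate indices.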
The reference [18] explains in detail the process to be carried out in order to use the BATS and TBATS methods.

2.2.5. Multiple Seasonal Holt-Winters Models with Discrete Interval Moving Seasonalities

nHWT models are robust to variations in the series, but special situations sometimes occur in which it is interesting to take anomalies into account. One of the clearest examples is the influence of the calendar effect on electricity demand [65]. These anomalous, specific situations can sometimes be modeled as a discrete seasonality, if they follow a repetitive pattern. Despite being seasonal, since they are discrete they have the particular quality of not being located at fixed moments in the time series; therefore, they are not linked to a deterministic appearance, as would be the case for a regular seasonality. These seasonalities are called discrete interval moving seasonalities (DIMS). Trull et al. [25] include discrete seasonality in their model, so that the model seen in (6)–(9) now results in (21)–(25), which is named nHWT-DIMS:

l_t = α (y_t / (∏_{i} s^{(i)}_{t−s_i} ∏_{h} D^{(h)}_{t*−s*_h})) + (1 − α)(l_{t−1} + ϱ b_{t−1}), (21)
b_t = γ (l_t − l_{t−1}) + (1 − γ) ϱ b_{t−1}, (22)
s^{(i)}_t = δ^{(i)} (y_t / (l_t ∏_{j≠i} s^{(j)}_{t−s_j} ∏_{h} D^{(h)}_{t*−s*_h})) + (1 − δ^{(i)}) s^{(i)}_{t−s_i}, (23)
D^{(h)}_{t*} = δ_D^{(h)} (y_t / (l_t ∏_{i} s^{(i)}_{t−s_i} ∏_{g≠h} D^{(g)}_{t*−s*_g})) + (1 − δ_D^{(h)}) D^{(h)}_{t*−s*_h}, (24)
ŷ_{t+k} = (l_t + Σ_{j=1}^{k} ϱ^j b_t) ∏_{i} s^{(i)}_{t−s_i+k} ∏_{h} D^{(h)}_{t*−s*_h+k} + φ^k ε_t. (25)

Here the term D^{(h)}_{t*} is included, which represents the discrete seasonal indices for each DIMS h considered, up to n_D. DIMS are defined only in the time intervals in which the special event takes place. These time intervals are designated using t*_h for each DIMS h; this nomenclature is chosen to distinguish them from the continuous time interval t.

The great difference between this model and other methods of modeling special situations is that the effect produced by the anomaly in the series is modeled as an internal part of the model, as one more seasonality, and is smoothed with each new appearance, unlike approaches based on dummy variables and/or modifications of the original series.
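The key mechanical difference from the plain nHWT recursion is that the DIMS factor enters (and its index is updated) only while a special event is active. A minimal single-seasonality, single-DIMS sketch, with invented function name, parameter values and toy data, and a simplified form of Eq. (24):

```python
def nhwt_dims_step(y_t, l_prev, b_prev, s_idx, d_idx, dims_active,
                   alpha=0.2, gamma=0.05, delta=0.1, delta_d=0.1, rho=1.0):
    """One recursion of Eqs. (21)-(24) with one regular seasonality and one DIMS.
    s_idx[0] is s_{t-s1}; d_idx[0] is D*_{t*-s*}; the DIMS factor applies only
    while the special event is active."""
    d_factor = d_idx[0] if dims_active else 1.0
    l_t = alpha * (y_t / (s_idx[0] * d_factor)) + (1 - alpha) * (l_prev + rho * b_prev)
    b_t = gamma * (l_t - l_prev) + (1 - gamma) * rho * b_prev
    s_t = delta * (y_t / (l_t * d_factor)) + (1 - delta) * s_idx[0]
    s_idx = s_idx[1:] + [s_t]
    if dims_active:                       # the discrete index is smoothed only here
        d_t = delta_d * (y_t / (l_t * s_t)) + (1 - delta_d) * d_idx[0]
        d_idx = d_idx[1:] + [d_t]
    return l_t, b_t, s_idx, d_idx

l, b = 100.0, 0.0
s_idx = [0.9, 1.1]    # regular seasonality of period 2 (period 10 in the paper)
d_idx = [1.3]         # one DIMS index; the event here lasts a single step
for y, active in [(95.0, False), (112.0, False), (118.0, True), (91.0, False)]:
    l, b, s_idx, d_idx = nhwt_dims_step(l_prev=l, b_prev=b, y_t=y,
                                        s_idx=s_idx, d_idx=d_idx, dims_active=active)
```

Outside the event, the DIMS index is simply carried forward unchanged until its next appearance, which is what makes the variable recurrence s*_h possible.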
In the nHWT models, the seasonality equation shows a fixed recurrence for each seasonal pattern (s_i) being considered. With DIMS this is not possible, since the occurrences of special events do not follow a deterministic pattern in the series. Therefore, the variable s*_h indicates, for each DIMS and each occurrence, the recurrence to consider.

One possible situation with special events is the simultaneous occurrence of two events. In such a case, the forecaster should consider the option of using only one of the DIMS that occur at that time, or using both, if the effects produced by the two special events add up.

An important aspect to consider is the initialization of the DIMS. A DIMS may have few occurrences in the time series, and its seasonal indices must therefore be calculated in such a way that they converge rapidly to the desired effect. The initialization method consists in first obtaining the initial values of the level, the trend, and the seasonal indices for the regular seasonality. Subsequently, a decomposition of the series into trend and multiple seasonalities is carried out. It is common to use the multiple STL method (Seasonal-Trend decomposition procedure using Loess [66], where Loess is a method to estimate linear relationships). From the decomposition, the series can be reconstructed without including the irregular part, which is where the information necessary to obtain the desired indices is found. The initial values are obtained by weighting the time series against the reconstructed series.

The adjustment of the parameters follows the same procedure as for the nHWT, with the exception that, if necessary, it can be carried out in two steps: first adjusting the parameters of the regular model and then adjusting the parameters associated with the DIMS. Adjusting all the parameters simultaneously obtains models with more reliable predictions, while the two-step option is faster.
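The weighting step of the initialization just described can be sketched as follows. This is an illustration, not the authors' code: the STL decomposition is replaced by an already-computed trend-plus-seasonality reconstruction, and the function name and toy numbers are invented.

```python
def initial_dims_indices(y, recon, occurrences):
    """Initial DIMS indices obtained by weighting the observed series against a
    reconstruction without the irregular component, averaged over all occurrences.
    `occurrences` lists (start, end) sample ranges where the DIMS is defined."""
    length = occurrences[0][1] - occurrences[0][0]
    idx = [0.0] * length
    for start, _ in occurrences:
        for k in range(length):
            idx[k] += y[start + k] / recon[start + k]
    return [v / len(occurrences) for v in idx]

recon = [10.0] * 12                  # stand-in for the STL trend + seasonal rebuild
y = list(recon)
for start in (0, 6):                 # two occurrences of the special event
    y[start:start + 3] = [12.0, 8.0, 10.0]
idx = initial_dims_indices(y, recon, [(0, 3), (6, 9)])
```

Averaging the ratio y/recon across the few available occurrences gives starting indices already close to the event's shape, which is what lets the DIMS converge quickly despite its sparse appearances.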
For this work, therefore, the parameters are adjusted simultaneously (the first option).

3. Results

The approach proposed for the work described below has the following scheme. First, a study of the series is carried out to determine the seasonal periods. The study is then carried out using continuous seasonality models and discrete seasonality models, as described in the previous section. Although it is preferable to use an error-minimization criterion specific to each technique when fitting the models, the RMSE, defined in (26), is used to standardize and compare the fitted results:

RMSE = √((1/N) Σ_{t=1}^{N} (y_t − ŷ_t)²). (26)

Here N stands for the length of the dataset used for training. The final comparison will be made according to the forecasts made on the validation set.

3.1. Analysis of the Seasonality of the Series

The series shown in Figure 3 clearly presents a seasonality with a periodicity of one hour. However, to study further seasonal patterns it is necessary to perform an analysis in the frequency domain. To investigate the appreciable frequencies in the time series, a spectral density analysis is carried out, the result of which is shown in Figure 7 in the form of a smoothed periodogram. A smoothed periodogram is the preferred tool here, as the periodic cycles do not show a regular periodicity [67].

Figure 7. Smoothed periodogram obtained from the time series shown in Figure 3.

Analyzing the figure, the presence of a clearly dominant frequency is observed, which corresponds to a periodicity of ten units of time (one hour). The presence of another dominant frequency can also be observed, corresponding to a second seasonality with a period of 106 time-units. However, this second seasonality is associated with a greater variability around its value, which confirms what is seen in Figure 3. To confirm these hypotheses, an ALLSSA analysis is performed.
This method is robust against unequally spaced time series, estimating trend and seasonal components simultaneously and providing the statistically significant components in the time series [35]. The analysis shows three main, significant frequencies at periodicities of 10, 118, and 203 time-units. This disagreement between the two methods suggests that, although various seasonalities clearly coexist, the non-dominant seasonalities do not occur continuously and may distort the analysis.

To contrast these results, an analysis based on wavelets is also carried out. The advantage of using wavelets to analyze the spectral content of the series is that we obtain a map in the time-scale plane. The concept of frequency in spectral analysis is now replaced by the scale factor, and therefore, instead of a periodogram, we use a scalogram. The scale measures the stretching of the wavelet and is inversely related to frequency: the greater the scale, the lower the frequency of the series, so the scale corresponds to a period [68]. This map allows the non-stationary characteristics of the signal, including changes in periodicity, to be studied, which is the objective.

Figure 8 shows the average wavelet power graph. This graph shows the means of the powers developed over time for each period or frequency. Although the results are similar to those shown in the previous graph, a greater variability is observed in the longer periods. Three main periods are located at 10, 94, and 196 time-units. These results are very close to the previous ones.

Figure 8. Plot of wavelet power averages across time. The red bullets show the significance level (0.05).

The need for a robust analysis using the time and frequency domains motivates the use of LSWA [35,69].
The software LSWAVE [70] in MATLAB is an easy and intuitive tool for performing this analysis. This software computes the least-squares wavelet spectrum (LSWS) for the series, with no need for preprocessing, transforming, or detrending. LSWA considers the correlations of the sinusoidal functions and constituents and the noise at the same time. We apply LSWA to the training set, with the results shown in Figure 9.

The abscissa axis indicates the time scale used, while the ordinate axis shows the cyclical frequencies (as 1/period). The level of variance explained is reflected by colors, according to the scale to the right of the graph.

The first conclusion is clear from the graph: the one-hour seasonality remains the predominant seasonality practically throughout the series (with a percentage of variance greater than 90%), but discontinuously. In the sections where this does not occur, a series of sawtooth-shaped formations stand out from the rest, although the percentage of variance they reflect does not exceed 30%. Some areas within the high-frequency region are shaded with more than 40% of the variance. This is shown in closer detail in Figure 10.

Figure 9. Least-squares wavelet analysis (LSWA) applied to the training set of electricity consumption.

Figure 10. Detail in 3D for the lowest cyclic frequencies in the LSWA analysis.

We decided to use a 3D representation because it makes the lowest cyclic frequencies easier to appreciate. Between 14th November and 15th November, and later between 19th November and 21st November, two frequencies with a percentage of variance of over 40% appear; these correspond to a period of 100 time-units. In the middle, between the two intervals, some peaks with 30% of the variance are also located, corresponding to a periodicity of 200 time-units.
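The basic idea behind all of these frequency-domain tools can be illustrated with a bare-bones, unsmoothed periodogram: score each candidate frequency by the power of its DFT bin and pick the strongest. This sketch (function name and synthetic series invented here) is far cruder than the smoothed periodogram, ALLSSA or LSWA used above, but it shows the mechanism on a signal with the same dominant ten-sample period:

```python
import cmath, math

def dominant_period(y):
    """Return the period (in samples) of the DFT bin with the most power:
    a bare-bones, unsmoothed stand-in for a periodogram."""
    n = len(y)
    mean = sum(y) / n
    c = [v - mean for v in y]                    # remove the mean (zero-frequency bin)
    best_k, best_p = 1, -1.0
    for k in range(1, n // 2 + 1):
        coef = sum(c[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        power = abs(coef) ** 2
        if power > best_p:
            best_k, best_p = k, power
    return n / best_k

# A dominant 10-sample cycle plus a weaker slow cycle, mimicking the series studied.
y = [math.sin(2 * math.pi * t / 10) + 0.3 * math.sin(2 * math.pi * t / 50)
     for t in range(200)]
p = dominant_period(y)
```

What this naive estimator cannot do, and what motivates LSWA, is localize the weaker cycle in time: the single global spectrum averages away the intermittent behavior visible in Figures 9 and 10.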
The conclusion from this analysis is that there is clearly one seasonality that occurs every hour (ten time-units), and a second pattern with irregular behavior over time and a periodic length established at between 94 and 118 units. Although the latter is not strictly a seasonality, it can be modeled as a second seasonality. A marginal seasonality can be found for long cycles, but it will not be taken into account, as its influence on the time series seems very small compared to the previous ones.

3.2. Application of Models with Regular Seasonality

Given this disparity of values for the determination of the length of the seasonal periods, we choose to carry out one analysis with the models using a single seasonality (ten time-units) and another using two seasonalities. For the models with regular seasonality, the second seasonality will be tested over a range of periods of between 90 and 110 time-units.

3.2.1. Application of ANN

One of the most powerful tools for working with neural networks is the MATLAB Deep Learning Toolbox. This toolbox includes a great variety of possibilities and different neural networks. From this range of possibilities, the NARX network and the NLR network are used. These two networks have proven to be efficient in predicting future electricity demand values. The method of working with them is described in Figure 11. Here, it can be seen that it is first necessary to use the historical information about demand.

Figure 11. Working scheme with neural networks using the Deep Learning Toolbox from MATLAB.

To address the observed seasonality, the series is additionally given a delay according to the seasonality. In this case, the seasonality of one hour corresponds to ten units of time of six minutes, so a delay of ten units is introduced to the series. Additional variables are added to the information provided by the time series.
The exogenous information supplied to the model is:
- The previous hour's average electricity consumption.
- The electricity consumption from the previous hour.
- The timestamp (only in the NARX model).

The networks used are described in Table 1. The NARX model includes a single hidden layer and a TDL of three time units. It was checked that a greater TDL did not improve the results. The NLR model also includes a single hidden layer.

Table 1. Neural network parameters and RMSE for fitted results.

Neural Network | Parameters | Fit RMSE
NARX | 1 input layer; 1 hidden layer with 20 neurons; 1 output layer; TDL = 3 | 33.95
NLR | 1 input layer; 1 hidden layer with 20 neurons; 1 output layer | 33.21

The training process is performed by minimizing the MSE using the Levenberg-Marquardt algorithm. The training result is displayed as the RMSE in Table 1. In the same way as the first seasonality was introduced, the second seasonality is added, following Section 3.2. Adding a new seasonality does not improve the result, so the chosen model has only one seasonality.

3.2.2. Application of ARIMA Models

To apply the ARIMA models, MATLAB is the chosen platform, using the Econometrics Toolbox and SSMMATLAB [71] for double seasonal ARIMA. R is also tested with the 'forecast' package, but the MATLAB results outperformed the R results. Like the previous models, models with one and two seasonalities according to Section 3.2 are tested. The best results are found using a single seasonality of one hour (ten time-units). The best model is ARIMA(4,2,4) × (4,1,4)₁₀, for which the parameters are shown in Table 2.

Table 2. ARIMA parameters and RMSE for fitted results (fit RMSE = 50.42).

Component | Parameters
AR | φ₁ = −0.145; φ₂ = −0.472; φ₃ = −0.190; φ₄ = 0.170
MA | θ₁ = 0.861; θ₂ = −0.601; θ₃ = 0.443; θ₄ = 0.174
SAR | Φ₁ = −0.487; Φ₂ = 0.067; Φ₃ = −0.260; Φ₄ = −0.067
SMA | Θ₁ = 0.142; Θ₂ = 0.069; Θ₃ = −0.292; Θ₄ = −0.065

3.2.3.
Application of nHWT Models

To perform the analysis using the nHWT models, a proprietary tool developed in MATLAB is used. This tool comes in the form of a toolbox, but it has not yet been published. Models with one and two seasonalities are tested.

The results show that the model that best adapts to this series presents a single seasonality, with the parameters α = 0.0001, γ = 0.0001, δ = 0.4856 and φ = 0.9286. The damping factor ϱ is set to 0: models including this parameter were tested, but the results did not improve, so it was removed from the model. The RMSE of the fit process is 54.93.

The result is not surprising, since the nHWT models are structural and do not allow for a relaxation of seasonality. Once seasonality is established, the model will continue to develop the same pattern, even when this is not really reflected in the series. When using a single seasonality, the information provided by the second seasonal pattern is lost, but this loss has less influence than the error caused by a second seasonality that does not coincide with the real seasonality of the series.

3.2.4. Application of SSM Models

To work with the state space models, the 'forecast' library is used in R [72]. Models with a single seasonality are tested, as well as models that include several seasonalities, as indicated in Section 3.1. Here again, the use of the trend damping parameter did not provide better results and was removed from the model.

As in the previous cases, the models with several seasonalities do not show better results than the models with a single seasonality. Table 3 shows the models used, including their arguments, in the first column; the parameters obtained after the adjustment process in the second column; and the RMSE value of the adjustment in the third column.

Table 3. SSM models, their parameters, and fit RMSE for fitted results.
Model Arguments | Parameters | Fit RMSE
BATS(0.717, 0, {5,2}, 10) | λ = 0.717, α = 0.120, γ = 0.206, δ = −0.004, φ₁ = 0.945, φ₂ = −0.625, φ₃ = 0.022, φ₄ = 0.104, φ₅ = −0.500, θ₁ = −0.153, θ₂ = 0.515 | 46.77
TBATS(0.756, 0, {5,2}, {10,1}) | λ = 0.756, α = 0.862, γ = 0.105, δ₁ = −0.0004, δ₂ = 0.0001, φ₁ = 1.293, φ₂ = −0.552, φ₃ = −0.359, φ₄ = 0.227, φ₅ = −0.1968, θ₁ = −1.358, θ₂ = 0.783 | 45.44

3.3. Application of Discrete Seasonality Models (nHWT-DIMS)

The application of discrete seasonality carries with it a differentiated strategy. The use of nHWT-DIMS models makes it possible to model localized seasonalities at specific instants of time, independently of the other seasonalities. Figure 12 shows two different periods of the series. In addition to the seasonal pattern described at the beginning (of one hour), a new pattern can be observed in Figure 12a, whose length is established at 27 time-units (2 h 42 min). This pattern is framed between two dashed lines including the demand peaks. Figure 12b shows another seasonal pattern that has the same length but a different behavior. These two patterns will be called DIMS a and DIMS b.

Figure 12. Discrete interval moving seasonality (DIMS) locations and recursion design. Two additional seasonal patterns are located; the first mostly appears in (a), while the second pattern appears in (b). Vertical dashed lines delimit the DIMS period length. The appearance number of each DIMS is numbered at the bottom of the figure in red. Lines with arrows represent the recursivity of the DIMS, with full lines for the DIMS in (a) and dashed lines for the DIMS in (b). The time span since the previous appearance is shown in minutes over the line.

The appearance of each discrete seasonality does not occur on a regular basis. This causes the recursion required in the Holt-Winters models to be variable. This is indicated in Figure 12 by the lines with arrows.
The solid lines indicate the recursion for DIMS a, and the dashed lines indicate it for DIMS b. The information regarding the DIMS is organized in Table 4. This table includes the locations of the discrete seasonalities on every appearance (the starting and ending times when the DIMS is defined, used in the variable t*_h) and the associated recursivity in minutes, which corresponds to s*_h. As an example, the time interval when the second appearance of DIMS a is defined starts at 04:00 pm on the 14th and ends at 06:42 pm on the 14th; the recursivity s*_a during this interval is 618 min.

Table 4. Location in the time series of the discrete seasonalities (DIMS) and their recursivity. The column Nr. indicates the order of appearance of the corresponding DIMS. 'Time starts' and 'Time ends' reflect the moving interval in which the DIMS is defined, and 'Recursivity' shows the length of time since the previous appearance.

DIMS a
Nr. | Time starts | Time ends | Recursivity
1 | 14th November at 05:42 am | 14th November at 08:24 am | —
2 | 14th at 04:00 pm | 14th at 06:42 pm | 618 min
3 | 15th at 00:18 am | 15th at 03:00 am | 498 min
4 | 15th at 07:30 am | 15th at 10:12 am | 432 min
5 | 15th at 06:42 pm | 15th at 09:24 pm | 672 min
6 | 16th at 07:24 pm | 16th at 10:06 pm | 1482 min
7 | 17th at 10:42 am | 17th at 01:24 pm | 918 min
8 | 18th at 06:06 am | 18th at 08:48 am | 1164 min
9 | 19th at 02:30 am | 19th at 05:12 am | 1224 min
10 | 19th at 08:48 pm | 19th at 11:30 pm | 1098 min
11 | 21st at 06:06 pm | 21st at 08:48 pm | 2718 min

DIMS b
Nr. | Time starts | Time ends | Recursivity
1 | 16th November at 07:06 am | 16th November at 09:48 am | —
2 | 20th at 05:06 am | 20th at 07:48 am | 5640 min
3 | 20th at 05:24 pm | 20th at 08:06 pm | 738 min
4 | 21st at 02:36 am | 21st at 05:18 am | 552 min
5 | 22nd at 02:24 am | 22nd at 05:06 am | 1428 min

The general model described by Equations (21)–(25) now results in the equations shown in (27)–(32), with one seasonality of length ten time-units and two DIMS, as described in Table 4.
l_t = α (y_t / (s_{t−10} D^{a}_{t*−s*_a} D^{b}_{t*−s*_b})) + (1 − α)(l_{t−1} + b_{t−1}), (27)
b_t = γ (l_t − l_{t−1}) + (1 − γ) b_{t−1}, (28)
s_t = δ (y_t / (l_t D^{a}_{t*−s*_a} D^{b}_{t*−s*_b})) + (1 − δ) s_{t−10}, (29)
D^{a}_{t*} = δ_a (y_t / (l_t s_{t−10} D^{b}_{t*−s*_b})) + (1 − δ_a) D^{a}_{t*−s*_a}, (30)
D^{b}_{t*} = δ_b (y_t / (l_t s_{t−10} D^{a}_{t*−s*_a})) + (1 − δ_b) D^{b}_{t*−s*_b}, (31)
ŷ_{t+k} = (l_t + k b_t) s_{t−10+k} D^{a}_{t*−s*_a+k} D^{b}_{t*−s*_b+k} + φ^k ε_t. (32)

Here, s_t is the seasonal equation for the regular seasonality of ten time-units, with smoothing parameter δ. D^{a}_{t*} and D^{b}_{t*} are the DIMS described in Table 4, with smoothing parameters δ_a and δ_b, defined only in the times t*_a and t*_b. The recursivities s*_a and s*_b are defined in Table 4.

To use the model, the procedure described in [25] is carried out. The initial values are obtained as follows: for the level, the moving average of the first one-hour period; for the trend, the slope between the first and second one-hour cycles; and for the seasonal indices, the weighting of the series in the first cycle against the moving average. Subsequently, the seasonal indices of the DIMS are obtained. The time series is decomposed into its trend, seasonal, and irregular components using STL decomposition with a period length of one hour. From these components, the series is rebuilt, but without the irregular component. The seasonal indices are obtained by weighting the original series over the reconstructed one.

Once the initial values of the model have been determined, the model is fitted by minimizing the RMSE, and the smoothing parameters are obtained. The tool for this analysis is software developed in MATLAB for this purpose. The smoothing parameters obtained are α = 0.0001, γ = 0.0001, δ = 0.4853 for the regular seasonality, δ_a = 0.0005 for DIMS a (see Figure 12a), δ_b = 0.0652 for DIMS b (see Figure 12b) and φ = 0.9056. Here again, it was decided not to use the damping parameter for the trend. The RMSE of the fitted model is 58.65.

3.4. Model Fit Comparison

A benchmark summary is reported in Table 5, where the RMSE of the fit process is summarized.
The RMSE used to compare the models during fitting shows that the ANNs fit the time series better than the other models; the worst fit corresponds to nHWT-DIMS. The comparison also shows that the state space models and the ARIMA models fit the observed data better than the nHWT and nHWT-DIMS models. Similar behavior would be expected in the forecasts.

Table 5. Main benchmarking results. RMSE is used to compare fitted results; MAPE is used to compare forecast accuracy.

Model | RMSE on Fit | Average MAPE for Forecasts
NARX | 33.95 | 26.63%
NN-NLR | 33.21 | 13.94%
ARIMA | 50.42 | 24.03%
nHWT | 54.93 | 18.55%
TBATS | 46.77 | 37.60%
BATS | 45.44 | 37.61%
nHWT-DIMS | 58.65 | 16.00%

3.5. Forecasts Comparison

The greatest interest is in forecast reliability. To compare the results of the forecasts given by the different methods, the mean absolute percentage error (MAPE) is used, as indicated in (33). This is a common indicator used to compare demand forecasts [73]:

MAPE(h) = (100/h) Σ_{k=1}^{h} |y_{t+k} − ŷ_{t+k}| / y_{t+k}. (33)

Here, h is the forecast horizon to evaluate. As the forecasts are made one hour ahead (ten units) throughout the validation subset, h can take values from one to ten time-units. From the one-hour-ahead forecasts, the MAPE is obtained by comparison with the real values of the validation set, using the procedure described in [74]. The benchmark summary in Table 5 includes the average MAPE, obtained as the mean of MAPE(h) with h = 1, 2, …, 10. The best forecasts, on average, are produced by the NLR. The nHWT-DIMS models are revealed as a competitive method against the regular seasonal models, outperforming the other statistical models.

Figure 13 shows the MAPE of these forecasts as a function of the forecasting horizon. It is clear from the results obtained that traditional models with one seasonality are not capable of working with this type of series. The BATS and TBATS state space models do not drop below 30% MAPE.
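The two error measures used throughout this comparison, the fitting RMSE of (26) and the forecast MAPE of (33), can be sketched directly (function names and toy numbers are invented for the example):

```python
def rmse(actual, fitted):
    """Eq. (26): root mean squared error over the N training observations."""
    n = len(actual)
    return (sum((a - f) ** 2 for a, f in zip(actual, fitted)) / n) ** 0.5

def mape(actual, forecast, h):
    """Eq. (33): mean absolute percentage error, in percent, over a horizon of h steps."""
    return 100.0 / h * sum(abs(a - f) / abs(a)
                           for a, f in zip(actual[:h], forecast[:h]))

fit_err = rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0])   # sqrt(4/3)
fc_err = mape([100.0, 200.0], [110.0, 180.0], 2)   # 10.0
```

Note that, unlike the RMSE, the MAPE is scale-free, which is why it is the measure retained for comparing forecasts across methods.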
The ARIMA model starts by making very short-term forecasts with a MAPE of below 15%, but beyond three time-units it is not capable of making good predictions. The nHWT models improve the forecasts with respect to the previous ones, and the use of the DIMS keeps the level of the predictions below 20% at all horizons. However, the method that produces the best results is the NN-NLR: these models give forecasts that remain almost constant, with an accuracy of about 14% MAPE.

Figure 13. Mean absolute percentage error (MAPE) comparison of one-hour-ahead forecasts.

4. Discussion

The results obtained in the previous exercise show that irregularities in the series have an enormous influence on the performance of the statistical models used in this article. The models that use regular seasonalities require that the seasonalities appear with a regular length, regardless of whether the pattern varies. When dealing with series whose seasonality is not completely defined, these models cannot overcome such variations. The use of models with discrete seasonality allows progress on this problem, since it is capable of introducing seasonality only where it occurs. Though the periodicity of 200 time-units did not show a consistent pattern over time in this data set, a time series longer than seven days (e.g., a two-month record or more) may reveal discontinuous patterns repeating at that low frequency, which may help to better train the models for forecasting such signals. This requires further investigation and research.

However, the best tool for this series is the use of AI to address these irregularities. Curiously, the NARX neural network does not offer good results, but the NLR neural network manages to improve the results.
This is because the previously described models require seasonal stability in order to make predictions, since they are based on linear or structural models. The neural network model is not subject to these restrictions and exploits these irregularities to make forecasts. Future studies in this area should aim to make the structural models capable of handling the ambiguity between their seasonal processes produced by the inconsistency of the series in terms of seasonality.

5. Conclusions

In this article, we have analyzed time series forecasting methods applied to a pattern of electricity demand that has an irregular periodicity, so that the seasonality is not well defined. We have analyzed neural network models, ARIMA models, multiple seasonal Holt-Winters models and state space models using regular seasonalities, and multiple seasonal Holt-Winters models with discrete interval moving seasonalities.

To compare the behavior of all the models discussed, they were applied to the situation of a connection node with a hot-dip galvanizing company, where the time series of electricity consumption due to the heating of the bath causes seasonalities. A frequency analysis of the series using spectral density and least-squares wavelets showed that a first seasonality of one hour could be easily located; some other seasonalities could be considered, but their period was not clear. The problem with irregular seasonality is that the models need patterns that repeat themselves constantly, so the pattern must be defined for the entire time series. Nevertheless, the use of Holt-Winters models with discrete seasonality (nHWT-DIMS) allows these seasonalities to be separated efficiently and reliable predictions to be made.

The results showed that the use of nHWT-DIMS models improves the results compared to the rest of the statistical models.
This is an interesting proposal for companies because of the simplicity of its application and its good results: the MAPE obtained is around 16%. However, the NLR (ANN) showed better predictions, with a MAPE of 14%.

Our study contributes to the improvement of time series forecasting systems by including discrete seasonality in the model. This allows an efficient prediction method to be applied in situations of electrical demand with marked seasonality but non-regular periodic appearances.

Author Contributions: Conceptualization, O.T. and J.C.G.-D.; methodology, O.T. and J.C.G.-D.; software, O.T.; validation, J.C.G.-D. and A.P.-S.; formal analysis, A.P.-S.; investigation, O.T.; data curation, J.C.G.-D.; writing—original draft preparation, O.T.; writing—review and editing, J.C.G.-D. and A.P.-S.; supervision, J.C.G.-D. All authors have read and agreed to the published version of the manuscript.

Funding: This research received no external funding.

Acknowledgments: The authors would like to thank the editor and the anonymous referees for their thorough comments, deep analysis and suggestions.

Conflicts of Interest: The authors declare no conflict of interest.
Abbreviations

AI  Artificial intelligence
Al  Aluminum
AIC, AICc  Akaike's information criterion (and its corrected version)
ALLSSA  Anti-leakage least-squares spectral analysis
ANN  Artificial neural networks
AR(1)  Autoregressive model of order 1
ARIMA  Autoregressive integrated moving average
ARMA  Autoregressive moving average
BATS  Exponential smoothing state space model with Box-Cox transformation, ARMA errors, trend and seasonal components
BIC  Bayesian information criterion
BRT  Bagged regression trees
DIMS  Discrete interval moving seasonalities
LASSO  Least absolute shrinkage and selection operator
LSWA  Least-squares wavelet analysis
LSWS  Least-squares wavelet spectrum
MSE  Mean squared error
MAPE  Mean absolute percentage error
MLP  Multilayer perceptron
NARX  Non-linear autoregressive neural networks with exogenous variables
nHWT  Multiple seasonal Holt-Winters
nHWT-DIMS  Multiple seasonal Holt-Winters with discrete interval moving seasonalities
NLR  Non-linear regression
RMSE  Root mean squared error
SARIMAX  Seasonal autoregressive integrated moving average model with exogenous variables
SIC  Schwarz's information criterion
SSM  State-space models
STL  Seasonal-trend decomposition procedure using Loess
SVM  Support vector machines
TBATS  Exponential smoothing state space model with Box-Cox transformation, ARMA errors, trend and trigonometric seasonal components
TDL  Tapped delay line
Zn  Zinc
Applied Sciences – Multidisciplinary Digital Publishing Institute
Published: Dec 23, 2020