Combinations of forecasts were introduced by Bates and Granger (1969) and are a very common way to improve forecasting accuracy. The main idea behind combining forecasts is that different forecasting methods contain useful and independent information. According to Armstrong (2001), forecasting practice has provided evidence that combining individual forecasts improves forecasting accuracy. Moreover, combining forecasts is particularly useful when it is difficult to select the single most accurate forecasting method.
This toolbox includes six well-known quantitative methods to combine individual forecasts. These are:
1) Inverse Proportion to an error measure (Bates and Granger, 1969). Each forecast's weight is the inverse of its accuracy measure, divided by the sum of the inverses of the accuracy measures of all the forecasts. The available accuracy measures are either the mean or the variance of Squared Error, Absolute Error, Absolute Percentage Error, or Absolute Relative Error (i.e. the absolute error of each data point over the absolute error of the best individual forecast on the same data point).
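As a sketch of this weighting rule (the function name and use of NumPy are illustrative assumptions, not part of the toolbox):

```python
import numpy as np

def inverse_proportion_weights(errors):
    """Weight each forecast by the inverse of its error measure,
    normalised so the weights sum to one (Bates and Granger, 1969)."""
    errors = np.asarray(errors, dtype=float)
    inv = 1.0 / errors          # inverse of each forecast's accuracy measure
    return inv / inv.sum()      # normalise to sum to one

# Two methods with mean squared errors 2.0 and 6.0: the inverses are
# 0.5 and 1/6, so the weights are 0.75 and 0.25.
w = inverse_proportion_weights([2.0, 6.0])
```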
2) Inverse Rank (Aiolfi and Timmermann, 2006). Individual forecasts are ranked according to their performance: the model with the lowest error measure is ranked 1, the second lowest 2, and so on, and each weight is proportional to the inverse of the rank.
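A minimal sketch of inverse-rank weighting (function name assumed; ties are broken by index order here, which is a simplifying assumption):

```python
import numpy as np

def inverse_rank_weights(errors):
    """Rank forecasts by error measure (best = rank 1) and weight each
    by the inverse of its rank, normalised to sum to one."""
    errors = np.asarray(errors, dtype=float)
    order = errors.argsort()
    ranks = np.empty_like(order)
    ranks[order] = np.arange(1, len(errors) + 1)  # 1 = lowest error
    inv = 1.0 / ranks
    return inv / inv.sum()

# Errors [0.4, 0.1, 0.2] -> ranks [3, 1, 2] -> weights proportional
# to [1/3, 1, 1/2], i.e. [2/11, 6/11, 3/11].
w = inverse_rank_weights([0.4, 0.1, 0.2])
```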
3) Point Performance based on an error measure (Panagiotopoulos, 2012). The weights are assigned according to the number of data points on which a technique achieves the minimum error measure. The available accuracy measures are Squared Error, Absolute Error, Absolute Percentage Error, or Absolute Relative Error.
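A sketch of point-performance weighting under the assumption that each method's weight is its share of "wins" (the function name is hypothetical, and ties go to the lowest-indexed method via `argmin`):

```python
import numpy as np

def point_performance_weights(errors):
    """errors: (n_points, n_methods) matrix of per-point error measures.
    Each method's weight is the fraction of data points on which it
    achieves the minimum error."""
    errors = np.asarray(errors, dtype=float)
    wins = np.bincount(errors.argmin(axis=1), minlength=errors.shape[1])
    return wins / wins.sum()

# Method 0 has the smallest error on 3 of 4 points, method 1 on one point,
# so the weights are [0.75, 0.25].
errs = [[0.1, 0.5], [0.2, 0.3], [0.4, 0.1], [0.0, 0.2]]
w = point_performance_weights(errs)
```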
4) Information Criterion (Kolassa, 2011). Weights are derived from either the Akaike Information Criterion (AIC) or the Bayesian Information Criterion (BIC).
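Kolassa (2011) builds on the standard Akaike-weight formula, w_i ∝ exp(-Δ_i / 2) with Δ_i = AIC_i − min AIC; a sketch (function name assumed):

```python
import numpy as np

def akaike_weights(aic):
    """Akaike weights: w_i proportional to exp(-0.5 * (AIC_i - min AIC)).
    Subtracting the minimum AIC keeps the exponentials well scaled."""
    aic = np.asarray(aic, dtype=float)
    delta = aic - aic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Lower AIC -> higher weight; the same formula applies to BIC values.
w = akaike_weights([100.0, 102.0, 110.0])
```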
5) Constrained Linear Least Squares (Gunter, 1992).
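A sketch in the spirit of Gunter (1992): least-squares weights constrained to be nonnegative and to sum to one. The function name and the choice of SciPy's SLSQP solver are illustrative assumptions, not the toolbox's own implementation:

```python
import numpy as np
from scipy.optimize import minimize

def cls_weights(F, y):
    """Minimise ||F w - y||^2 subject to w >= 0 and sum(w) == 1.
    F: (n_points, n_methods) matrix of individual forecasts; y: actuals."""
    n = F.shape[1]
    res = minimize(
        lambda w: np.sum((F @ w - y) ** 2),
        x0=np.full(n, 1.0 / n),                      # start from equal weights
        bounds=[(0.0, None)] * n,                     # nonnegativity
        constraints=[{"type": "eq",
                      "fun": lambda w: w.sum() - 1.0}],  # weights sum to one
        method="SLSQP",
    )
    return res.x

# If the actuals are an exact convex combination of the forecasts,
# the constrained fit recovers the combining weights.
rng = np.random.default_rng(0)
F = rng.normal(size=(50, 2))
y = F @ np.array([0.7, 0.3])
w = cls_weights(F, y)
```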
6) Linear Programming (Reeves and Lawrence, 1991; Lam et al., 2001; Panagiotopoulos, 2012). The available objectives are:
a) Single objective linear programming that minimizes a 'sum of errors' index. Either the Sum of Absolute Errors, Sum of Absolute Percentage Errors, or Sum of Absolute Relative Errors.
b) Single objective linear programming that minimizes a 'maximum error' index. Either the Maximum Absolute Error, Maximum Absolute Percentage Error, or the Maximum Absolute Relative Error.
c) Weighted goal linear programming (WGP) that minimizes both a 'maximum error' and a 'sum of errors' indices (MinMax-MinSum approach).
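As an illustration of objective (a), the 'sum of absolute errors' case can be written as a linear program by introducing one auxiliary variable e_t ≥ |y_t − (Fw)_t| per data point. This sketch uses SciPy's `linprog`; the function name and the nonnegativity/sum-to-one constraints on the weights are assumptions for the example:

```python
import numpy as np
from scipy.optimize import linprog

def lp_min_sum_abs_errors(F, y):
    """Choose combination weights w (w >= 0, sum w = 1) minimising the
    Sum of Absolute Errors via the auxiliary-variable LP formulation."""
    T, n = F.shape
    # Decision vector x = [w_1..w_n, e_1..e_T]; objective = sum of e_t.
    c = np.concatenate([np.zeros(n), np.ones(T)])
    # F w - e <= y and -F w - e <= -y together encode e_t >= |(F w)_t - y_t|.
    A_ub = np.block([[F, -np.eye(T)], [-F, -np.eye(T)]])
    b_ub = np.concatenate([y, -y])
    A_eq = np.concatenate([np.ones(n), np.zeros(T)]).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + T))
    return res.x[:n]

# When the actuals are an exact convex combination of the forecasts,
# the optimal sum of absolute errors is zero at those weights.
rng = np.random.default_rng(1)
F = rng.normal(size=(30, 2))
y = F @ np.array([0.6, 0.4])
w = lp_min_sum_abs_errors(F, y)
```

The 'maximum error' objective in (b) uses the same device with a single auxiliary variable bounding every per-point error, and (c) combines both objectives in a weighted goal program.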
The toolbox includes a function for each of the combination methods, a function that encompasses all six of them, and a function that combines individual forecasts with predetermined weights or through simple averaging.
Armstrong J. S. (2001). Principles of Forecasting, Kluwer Academic Publishers, Massachusetts
Aiolfi M. and A. Timmermann (2006). Persistence in forecasting performance and conditional combination strategies. Journal of Econometrics, 135(1), pp. 31 – 53.
Bates J. M. and C. W. J. Granger (1969). The combination of forecasts. Journal of the Operational Research Society, 20(4), pp. 451 – 468.
Gunter S. I. (1992). Nonnegativity restricted least squares combinations, International Journal of Forecasting, 8(1), pp. 45 – 59.
Kolassa S. (2011). Combining exponential smoothing forecasts using Akaike weights. International Journal of Forecasting, 27(2), pp. 238 – 251.
Lam K. F., H. W. Mui and H. K. Yuen (2001). A note on minimising absolute percentage error in combined forecasts. Computers and Operations Research, 28(11), pp. 1141 – 1147.
Panagiotopoulos A. (2012) Optimising time series forecasts through linear programming. PhD thesis, University of Nottingham.
Reeves G. R. and K. D. Lawrence (1991). Combining forecasts given different types of objectives. European Journal of Operational Research, 51(1), pp. 65 – 72.