On using predictive-ability tests in the selection of time-series prediction models: A Monte Carlo evaluation

Costantini, Mauro and Kunst, Robert M. ORCID: https://orcid.org/0000-0001-6831-2471 (2020) On using predictive-ability tests in the selection of time-series prediction models: A Monte Carlo evaluation. International Journal of Forecasting, 37 (2), pp. 445-460. https://doi.org/10.1016/j.ijforecast.2020.06.010

Full text not available from this repository.


To select a forecast model among competing models, researchers often use ex-ante prediction experiments over training samples. Following Diebold and Mariano (1995), forecasters routinely evaluate the relative performance of competing models with accuracy tests and may base their selection on test significance in addition to comparing forecast errors. Using extensive Monte Carlo analysis, we investigated whether this practice favors simpler models over more complex ones without yielding gains in forecast accuracy. We simulated data from autoregressive moving-average (ARMA) models, self-exciting threshold autoregressive (SETAR) models, and vector autoregressions (VAR). We considered two variants of the Diebold–Mariano test, the test by Giacomini and White (2006), the F-test by Clark and McCracken (2001), the Akaike information criterion, and a pure training-sample evaluation. The findings showed some accuracy gains in small samples from applying accuracy tests, particularly the Clark–McCracken and bootstrapped Diebold–Mariano tests. On balance, however, the evidence weighed against this testing procedure, and training-sample evaluations without accuracy tests performed best in many cases.
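The Diebold–Mariano test referenced in the abstract compares two forecast-error series by testing whether the mean of their loss differential is zero. The following is a minimal sketch of the standard version under squared-error loss with an asymptotic normal p-value; the function name, loss choice, and long-run variance truncation are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from math import erf, sqrt


def diebold_mariano(e1, e2, h=1):
    """Diebold-Mariano test of equal predictive accuracy (squared-error loss).

    e1, e2 : forecast-error arrays from the two competing models
    h      : forecast horizon; the long-run variance sums h-1 autocovariances
    Returns (DM statistic, two-sided asymptotic p-value).
    A positive statistic indicates that model 1 has the larger loss.
    """
    d = np.asarray(e1, float) ** 2 - np.asarray(e2, float) ** 2  # loss differential
    n = d.size
    dbar = d.mean()

    def gamma(k):  # sample autocovariance of d at lag k
        return np.mean((d[k:] - dbar) * (d[: n - k] - dbar))

    # rectangular-kernel long-run variance, as in the original DM proposal
    lrv = gamma(0) + 2.0 * sum(gamma(k) for k in range(1, h))
    stat = dbar / np.sqrt(lrv / n)
    pval = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(stat) / sqrt(2.0))))  # 2*(1 - Phi(|DM|))
    return stat, pval
```

The bootstrapped and small-sample-corrected variants examined in the paper modify the reference distribution of this statistic rather than the statistic itself.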

Item Type: Article in Academic Journal
Keywords: Forecasting, Time series, Predictive accuracy, Model selection, Monte Carlo simulation
Research Units: Current Research Groups > Macroeconomics and Business Cycles
Date Deposited: 21 Jan 2021 10:36
Last Modified: 11 Mar 2021 12:46
DOI: 10.1016/j.ijforecast.2020.06.010
ISSN: 0169-2070
URI: https://irihs.ihs.ac.at/id/eprint/5636
