nep-for New Economics Papers
on Forecasting
Issue of 2012‒03‒14
eleven papers chosen by
Rob J Hyndman
Monash University

  1. Oil price forecasting under asymmetric loss By Pierdzioch, Christian; Rülke, Jan-Christoph; Stadtmann, Georg
  2. Predicting Time-Varying Parameters with Parameter-Driven and Observation-Driven Models By Siem Jan Koopman; Andre Lucas; Marcel Scharth
  3. Forecasting Value-at-Risk Using Block Structure Multivariate Stochastic Volatility Models By Manabu Asai; Massimiliano Caporin; Michael McAleer
  4. Forecasting Value-at-Risk Using Block Structure Multivariate Stochastic Volatility Models By Manabu Asai; Massimiliano Caporin; Michael McAleer
  5. Forecasting Mixed Frequency Time Series with ECM-MIDAS Models By Götz Thomas; Hecq Alain; Urbain Jean-Pierre
  6. Modelling Changes in the Unconditional Variance of Long Stock Return Series By Cristina Amado; Timo Teräsvirta
  7. The Value of Multivariate Model Sophistication: An Application to pricing Dow Jones Industrial Average Options By Jeroen Rombouts; Lars Peter Stentoft; Francesco Violente
  8. Towards a benchmark on the contribution of education and training to employability: methodological note By Garrouste, Christelle
  9. Growth in Emerging Market Economies and the Commodity Boom of 2003–2008: Evidence from Growth Forecast Revisions By Elif C. Arbatli; Garima Vasishtha
  10. Testing for predictability in a noninvertible ARMA model By Lanne, Markku; Meitz, Mika; Saikkonen, Pentti
  11. Improving Classifier Performance Assessment of Credit Scoring Models By Raffaella Calabrese

  1. By: Pierdzioch, Christian; Rülke, Jan-Christoph; Stadtmann, Georg
    Abstract: Based on the approach advanced by Elliott et al. (Review of Economic Studies 72, 1107–1125), we find that the loss function of a sample of oil price forecasters is asymmetric in the forecast error. Our findings indicate that the loss oil price forecasters incurred when their forecasts exceeded the price of oil tended to be larger than the loss they incurred when their forecasts fell short of the price of oil. Accounting for the asymmetry of the loss function does not necessarily make forecasts look rational.
    Keywords: oil price, forecasting, loss function, rationality of forecasts
    JEL: F31 D84
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:zbw:euvwdp:314&r=for
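    The flexible loss family this paper builds on (Elliott et al.) nests symmetric and asymmetric loss in a single asymmetry parameter. A minimal statement in LaTeX follows; the sign convention for the forecast error is a common choice, not necessarily the authors' exact notation:

        L(e_t; \alpha, p) = \left[ \alpha + (1 - 2\alpha)\, \mathbf{1}(e_t < 0) \right] |e_t|^p,
        \qquad e_t = y_t - \hat{y}_t, \quad \alpha \in (0, 1), \quad p \in \{1, 2\}.

    Setting p = 1 gives lin-lin loss, p = 2 quad-quad loss, and \alpha = 1/2 recovers symmetric loss. Under this convention an estimated \alpha < 1/2 penalizes negative errors (forecasts above the realized oil price) more heavily, which is the asymmetry reported in the abstract.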
  2. By: Siem Jan Koopman (VU University Amsterdam); Andre Lucas (VU University Amsterdam); Marcel Scharth (VU University Amsterdam)
    Abstract: We study whether and when parameter-driven time-varying parameter models lead to forecasting gains over observation-driven models. We consider dynamic count, intensity, duration, volatility and copula models, including new specifications that have not been studied earlier in the literature. In an extensive Monte Carlo study, we find that observation-driven generalised autoregressive score (GAS) models have similar predictive accuracy to correctly specified parameter-driven models. In most cases, differences in mean squared errors are smaller than 1% and model confidence sets have low power when comparing these two alternatives. We also find that GAS models outperform many familiar observation-driven models in terms of forecasting accuracy. The results point to a class of observation-driven models with comparable forecasting ability to parameter-driven models, but lower computational complexity.
    Keywords: Generalised autoregressive score model; Importance sampling; Model confidence set; Nonlinear state space model; Weibull-gamma mixture
    JEL: C53 C58 C22
    Date: 2012–03–06
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20120020&r=for
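    The generalised autoregressive score (GAS) recursion driving the observation-driven models in this comparison updates the time-varying parameter f_t with the scaled score of the conditional density (Creal, Koopman and Lucas):

        f_{t+1} = \omega + A\, s_t + B\, f_t, \qquad
        s_t = S_t\, \nabla_t, \qquad
        \nabla_t = \frac{\partial \log p(y_t \mid f_t; \theta)}{\partial f_t},

    where S_t is a scaling matrix, commonly the inverse (or inverse square root) of the Fisher information. The filter is available in closed form, whereas parameter-driven competitors give f_t its own innovation and typically need importance sampling to evaluate the likelihood; this is the lower computational complexity the abstract refers to.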
  3. By: Manabu Asai; Massimiliano Caporin; Michael McAleer (University of Canterbury)
    Abstract: Most multivariate variance or volatility models suffer from a common problem, the “curse of dimensionality”. For this reason, most are fitted under strong parametric restrictions that reduce the interpretation and flexibility of the models. Recently, the literature has focused on multivariate models with milder restrictions, which aim to reconcile the interpretability and efficiency that model users need with the computational problems that may emerge when the number of assets is quite large. We contribute to this strand of the literature by proposing a block-type parameterization for multivariate stochastic volatility models. The empirical analysis of stock returns on the US market shows that 1% and 5% Value-at-Risk thresholds based on one-step-ahead forecasts of covariances from the new specification are satisfactory for a period that includes the global financial crisis.
    Keywords: block structures; multivariate stochastic volatility; curse of dimensionality; leverage effects; multi-factors; heavy-tailed distribution
    JEL: C32 C51 C10
    Date: 2012–03–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:12/04&r=for
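    The following toy construction conveys what a block-type parameterization buys: assets are grouped, correlations are shared within and between blocks, and the parameter count grows with the number of blocks rather than the number of asset pairs. It is a generic illustration in Python, not the authors' stochastic volatility specification, and the numbers are made up:

        import numpy as np

        def block_correlation(groups, within, between):
            # Correlation matrix in which assets of the same group share one
            # within-block correlation and each pair of groups shares one
            # between-block correlation.
            groups = list(groups)
            n = len(groups)
            R = np.eye(n)
            for i in range(n):
                for j in range(i + 1, n):
                    if groups[i] == groups[j]:
                        r = within[groups[i]]
                    else:
                        r = between[tuple(sorted((groups[i], groups[j])))]
                    R[i, j] = R[j, i] = r
            return R

        # Five assets in two blocks: 2 within + 1 between parameter,
        # instead of 10 free pairwise correlations.
        R = block_correlation(
            groups=[0, 0, 0, 1, 1],
            within={0: 0.6, 1: 0.4},
            between={(0, 1): 0.2},
        )
        print(np.linalg.eigvalsh(R))  # all positive here: valid correlation matrix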
  4. By: Manabu Asai (Soka University / Faculty of Economics); Massimiliano Caporin (Department of Economics and Management “Marco Fanno” University of Padova, Italy.); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute, The Netherlands, Department of Quantitative Economics, Complutense University of Madrid, and Institute of Economic Research, Kyoto University)
    Abstract: Most multivariate variance or volatility models suffer from a common problem, the “curse of dimensionality”. For this reason, most are fitted under strong parametric restrictions that reduce the interpretation and flexibility of the models. Recently, the literature has focused on multivariate models with milder restrictions, which aim to reconcile the interpretability and efficiency that model users need with the computational problems that may emerge when the number of assets is quite large. We contribute to this strand of the literature by proposing a block-type parameterization for multivariate stochastic volatility models. The empirical analysis of stock returns on the US market shows that 1% and 5% Value-at-Risk thresholds based on one-step-ahead forecasts of covariances from the new specification are satisfactory for a period that includes the global financial crisis.
    Keywords: block structures; multivariate stochastic volatility; curse of dimensionality; leverage effects; multi-factors; heavy-tailed distribution
    JEL: C32 C51 C10
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1203&r=for
  5. By: Götz Thomas; Hecq Alain; Urbain Jean-Pierre (METEOR)
    Abstract: This paper proposes a mixed-frequency error-correction model in order to develop a regression approach for non-stationary variables sampled at different frequencies that are possibly cointegrated. We show that, at the model representation level, the choice of the timing between the low-frequency dependent and the high-frequency explanatory variables to be included in the long-run relationship has an impact on the remaining dynamics and on the forecasting properties. Then, we compare in a set of Monte Carlo experiments the forecasting performances of the low-frequency aggregated model and several mixed-frequency regressions. In particular, we look at both the unrestricted mixed-frequency model and at a more parsimonious MIDAS regression. Whilst the existing literature has only investigated the potential improvements of the MIDAS framework for stationary time series, our study emphasizes the need to include the relevant cointegrating vectors in the non-stationary case. Furthermore, it is illustrated that the exact timing of the long-run relationship does not matter as long as the short-run dynamics are adapted according to the composition of the disequilibrium error. Finally, the unrestricted model is shown to suffer from parameter proliferation for small sample sizes whereas MIDAS forecasts are robust to over-parameterization. Hence, the data-driven, low-dimensional and flexible weighting structure makes MIDAS a robust and parsimonious method to follow when the true underlying DGP is unknown while still exploiting information present in the high frequency. An empirical application illustrates the theoretical and the Monte Carlo results.
    Keywords: econometrics
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:dgr:umamet:2012012&r=for
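    The parsimony of MIDAS that this paper exploits comes from replacing one coefficient per high-frequency lag with a low-dimensional weighting function; the exponential Almon scheme is a standard choice. A sketch of the generic weighting in Python (illustrative parameter values, not the paper's ECM-MIDAS estimates):

        import numpy as np

        def exp_almon_weights(theta1, theta2, n_lags):
            # Exponential Almon lag polynomial: two parameters shape the
            # entire profile of lag weights, normalized to sum to one.
            j = np.arange(1, n_lags + 1)
            w = np.exp(theta1 * j + theta2 * j ** 2)
            return w / w.sum()

        def midas_aggregate(x_hf, theta1, theta2):
            # Collapse a vector of high-frequency lags into one regressor.
            w = exp_almon_weights(theta1, theta2, len(x_hf))
            return w @ x_hf

        # Twelve monthly lags entering a quarterly regression:
        # two parameters instead of twelve unrestricted coefficients.
        x_monthly = np.random.default_rng(0).normal(size=12)
        print(midas_aggregate(x_monthly, theta1=0.1, theta2=-0.05))

    This is the sense in which MIDAS forecasts are robust to over-parameterization: however many high-frequency lags are included, only theta1 and theta2 are estimated.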
  6. By: Cristina Amado (University of Minho and NIPE); Timo Teräsvirta (Aarhus University, School of Economics and Management and CREATES)
    Abstract: In this paper we develop a testing and modelling procedure for describing the long-term volatility movements over very long return series. For this purpose, we assume that volatility is multiplicatively decomposed into a conditional and an unconditional component as in Amado and Teräsvirta (2011). The latter component is modelled by incorporating smooth changes so that the unconditional variance is allowed to evolve slowly over time. Statistical inference is used to specify the parameterization of the time-varying component by applying a sequence of Lagrange multiplier tests. The model building procedure is illustrated with an application to daily returns of the Dow Jones Industrial Average stock index covering a period of more than ninety years. The main conclusions are as follows. First, the LM tests strongly reject the assumption of constancy of the unconditional variance. Second, the results show that the long-memory property in volatility may be explained by ignored changes in the unconditional variance of the long series. Finally, based on a formal statistical test we find evidence that the new model yields more accurate volatility forecasts than the GJR-GARCH model at all horizons for a subset of the long return series.
    Keywords: Model specification; Conditional heteroskedasticity; Lagrange multiplier test; Time-varying unconditional variance; Long financial time series; Volatility persistence
    JEL: C12 C22 C51 C52 C53
    Date: 2012–02–28
    URL: http://d.repec.org/n?u=RePEc:aah:create:2012-07&r=for
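    The multiplicative decomposition used here (following Amado and Teräsvirta, 2011) writes the variance as the product of a GARCH-type conditional component h_t and a deterministic, slowly moving component g_t built from logistic transitions in rescaled time. Schematically, with a single-location transition function for simplicity:

        \sigma_t^2 = h_t\, g_t, \qquad
        g_t = \delta_0 + \sum_{l=1}^{r} \delta_l\, G(t/T;\, \gamma_l, c_l), \qquad
        G(x; \gamma, c) = \left( 1 + e^{-\gamma (x - c)} \right)^{-1}.

    The sequence of Lagrange multiplier tests mentioned in the abstract selects r, the number of transitions; constancy of the unconditional variance corresponds to r = 0.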
  7. By: Jeroen Rombouts; Lars Peter Stentoft; Francesco Violente
    Abstract: We assess the predictive accuracy of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 248 multivariate models that differ in their specification of the conditional variance, conditional correlation, and innovation distribution. All models belong to the dynamic conditional correlation class, which is particularly suited because it allows the risk-neutral dynamics to be estimated consistently with a manageable computational effort in relatively large-scale problems. It turns out that the most important gain in pricing accuracy comes from increasing the sophistication of the marginal variance processes (i.e. nonlinearity, asymmetry and component structure). Enriching the model with more complex correlation dynamics, or replacing the Gaussian innovation assumption with a Laplace one, improves the pricing to a smaller extent. Apart from directly investigating the value of model sophistication in terms of dollar losses, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance.
    Keywords: Option pricing, economic loss, forecasting, multivariate GARCH, model confidence set
    JEL: C10 C32 C51 C52 C53 G10
    Date: 2012–02–01
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2012s-05&r=for
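    All the models considered share the dynamic conditional correlation (DCC) backbone; in its scalar form (Engle, 2002) the correlation dynamics are

        Q_t = (1 - a - b)\, \bar{Q} + a\, u_{t-1} u_{t-1}^{\top} + b\, Q_{t-1}, \qquad
        R_t = \operatorname{diag}(Q_t)^{-1/2}\, Q_t\, \operatorname{diag}(Q_t)^{-1/2},

    where u_t stacks the standardized residuals from the univariate variance models and \bar{Q} is targeted at their sample second moment. The paper's sophistication axes then vary the univariate variance processes, the correlation dynamics and the innovation distribution around this recursion.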
  8. By: Garrouste, Christelle
    Abstract: This report presents the methodological framework applied to define the benchmark on education for employability to be proposed to the European Council in 2012. The first three sections present the proposed indicator; the fourth section discusses the sensitivity of that indicator to a change in data source and its correlation with counterfactuals. Section 5 presents the forecasting approach applied to define the target level at the horizon 2020, section 6 presents the results of the deterministic and stochastic forecasting models, and section 7 concludes.
    Keywords: European Benchmark; Employability; Graduates; Forecasting; Simulations
    JEL: A22 A23 E27 I20 J21 C15 A21
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:37153&r=for
  9. By: Elif C. Arbatli; Garima Vasishtha
    Abstract: Demand for industrial raw materials from emerging economies, particularly emerging Asia, is widely believed to have fueled the surge in oil and industrial commodity prices during 2002-2008. The paper first presents a simple storage model in which commodity prices respond to market participants' changing expectations of the future macroeconomic environment. In the model, the change in the price of a commodity depends on the unanticipated changes in demand factors, along with the real exchange rate, the real interest rate, and other factors that affect the marginal convenience yield. It then focuses on the role of demand factors by using a newly constructed monthly measure of unanticipated demand shocks for commodities based on revisions to professional forecasts of industrial production growth for a large group of emerging market and advanced economies. The empirical framework also controls for other macroeconomic factors that affect commodity prices, such as the real effective exchange rate (REER) of the U.S. dollar and the real interest rate. The results show that revisions to growth forecasts for emerging Asia play an important role in explaining movements in the real prices of industrial metals. In addition, the REER of the U.S. dollar is an important determinant of industrial commodity prices. For crude oil, growth forecast revisions for the U.S. and the real interest rate play a significant role in explaining real prices. Furthermore, growth surprises generally fall short of explaining both the fast run-up in most commodity prices during 2006-2008 and the magnitude of the collapse in prices during the recent global financial crisis.
    Keywords: Econometric and statistical methods; International topics
    JEL: Q41 Q43
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:12-8&r=for
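    A stylized version of the shock construction may help fix ideas: the unanticipated component of demand is proxied by the revision between successive forecasts of the same target, aggregated across countries. The Python sketch below illustrates that logic only; the variable names, weighting scheme and numbers are assumptions, not the paper's exact definitions:

        import numpy as np

        def demand_shock_proxy(forecast_now, forecast_prev, weights):
            # Revision = this month's forecast of a country's industrial
            # production growth minus last month's forecast for the same
            # target period; the proxy is the weighted sum across countries.
            revisions = np.asarray(forecast_now) - np.asarray(forecast_prev)
            return float(np.asarray(weights) @ revisions)

        # Toy month with three economies; a positive value signals an
        # unanticipated strengthening of demand for industrial commodities.
        print(demand_shock_proxy(
            forecast_now=[8.1, 6.0, 4.2],
            forecast_prev=[7.8, 6.2, 4.0],
            weights=[0.5, 0.3, 0.2],
        ))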
  10. By: Lanne, Markku; Meitz, Mika; Saikkonen, Pentti
    Abstract: We develop likelihood-based tests for autocorrelation and predictability in a first-order non-Gaussian and noninvertible ARMA model. Tests based on a special case of the general model, referred to as an all-pass model, are also obtained. Data generated by an all-pass process are uncorrelated but, in the non-Gaussian case, dependent and nonlinearly predictable. Therefore, in addition to autocorrelation, the proposed tests can also be used to test for nonlinear predictability. This makes our tests different from their previous counterparts based on conventional invertible ARMA models. Unlike in the invertible case, our tests can be derived by standard methods that lead to chi-squared or standard normal limiting distributions. A further convenience of the noninvertible ARMA model is that, to some extent, it can allow for conditional heteroskedasticity in the data, which is useful when testing for predictability in economic and financial data. This is illustrated by our empirical application to U.S. stock returns, where our tests indicate the presence of nonlinear predictability.
    Keywords: Non-Gaussian time series; noninvertible ARMA model; all-pass process; predictability of asset returns
    JEL: C53 G12 C22
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:37151&r=for
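    The all-pass special case pairs each autoregressive root with a reciprocal moving average root, so the spectral density is flat; the first-order version reads

        u_t - \phi\, u_{t-1} = \varepsilon_t - \frac{1}{\phi}\, \varepsilon_{t-1},
        \qquad 0 < |\phi| < 1,

    which is noninvertible because the moving average root \phi lies inside the unit circle. Such a process is serially uncorrelated, so linear autocorrelation tests have no power against it; any remaining predictability is nonlinear and, as the abstract notes, detectable only in the non-Gaussian case.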
  11. By: Raffaella Calabrese (Dynamics Lab, Geary Institute, University College Dublin)
    Abstract: In evaluating the predictive power of credit scoring models it is common to use the Receiver Operating Characteristic (ROC) curve, the Area Under the Curve (AUC) and the minimum probability-weighted loss. The main weakness of the first two assessments is that they do not take the costs of misclassification errors into account, while the last one depends on the number of defaults in the credit portfolio. The main purposes of this paper are to provide a curve, called the Misclassification Error Loss (MEL) curve, and a classifier performance measure that overcome the above-mentioned drawbacks. We prove that ROC dominance is equivalent to MEL dominance. Furthermore, we derive the probability distribution of the proposed predictive power measure and analyse its performance by Monte Carlo simulations. Finally, we apply the suggested methodologies to empirical data on Italian Small and Medium Enterprises.
    Keywords: Performance Assessment, Credit Scoring Models, Monte Carlo simulations, Italian Enterprises
    Date: 2012–02–20
    URL: http://d.repec.org/n?u=RePEc:ucd:wpaper:201204&r=for
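    The paper's starting point, that ROC and AUC ignore misclassification costs, is easy to see in code. The Python sketch below computes an empirical AUC and then a loss curve that prices the two error types separately; the cost-curve construction is a generic illustration in the spirit of the MEL idea, not the authors' exact definition, and the data are simulated:

        import numpy as np

        def roc_auc(scores, defaulted):
            # Empirical AUC: probability that a random defaulter outscores
            # a random non-defaulter (ties count one half).
            d = scores[defaulted == 1][:, None] - scores[defaulted == 0][None, :]
            return (d > 0).mean() + 0.5 * (d == 0).mean()

        def cost_weighted_loss(scores, defaulted, c_fn, c_fp, thresholds):
            # Expected loss per applicant at each cut-off, weighting missed
            # defaults (false negatives) and rejected good clients (false
            # positives) by unit costs -- the information plain AUC discards.
            losses = []
            for t in thresholds:
                flagged = scores >= t
                fn = np.mean(~flagged & (defaulted == 1))
                fp = np.mean(flagged & (defaulted == 0))
                losses.append(c_fn * fn + c_fp * fp)
            return np.array(losses)

        rng = np.random.default_rng(1)
        defaulted = rng.binomial(1, 0.1, size=2000)          # 10% default rate
        scores = rng.normal(loc=1.0 * defaulted, scale=1.0)  # noisy credit score
        print("AUC:", roc_auc(scores, defaulted))
        thresholds = np.linspace(scores.min(), scores.max(), 50)
        loss = cost_weighted_loss(scores, defaulted, 5.0, 1.0, thresholds)
        print("loss-minimizing cut-off:", thresholds[loss.argmin()])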

This nep-for issue is ©2012 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.