nep-for New Economics Papers
on Forecasting
Issue of 2013‒12‒15
fourteen papers chosen by
Rob J Hyndman
Monash University

  1. Comparing variable selection techniques for linear regression: LASSO and Autometrics. By Camila Epprecht; Dominique Guegan; Álvaro Veiga
  2. Real-Time Forecasting with a Mixed-Frequency VAR By Frank Schorfheide; Dongho Song
  3. Out-of-sample forecast tests robust to the choice of window size By Barbara Rossi; Atsushi Inoue
  4. Time Series under Present-Value-Model Short- and Long-run Co-movement Restrictions By Osmani Teixeira de Carvalho Guillén; Alain Hecq; João Victor Issler; Diogo Saraiva
  5. The changing relationship between commodity prices and equity prices in commodity exporting countries By Barbara Rossi
  6. On the Benefits of Equicorrelation By Adam Clements; Ayesha Scott; Annastiina Silvennoinen
  7. How will the court decide? Tax experts and the estimation of tax risk By Blaufus, Kay; Bob, Jonathan; Trinks, Matthias
  8. Have Financial Markets Become More Informative? By Jennie Bai; Thomas Philippon; Alexi Savov
  9. Over-optimistic official forecasts and fiscal rules in the eurozone By Jeffrey Frankel; Jesse Schreger
  10. Estimation and Inference under Weak Identification and Persistence: An Application on Forecast-Based Monetary Policy Reaction Function By Jui-Chung Yang; Ke-Li Xu
  11. Resampling in DEA By Kaoru Tone
  12. Incentive Effects of Fiscal Rules on the Finance Minister’s Behaviour: Evidence from Revenue Projections in Swiss Cantons By Florian Chatagny
  13. The Out-of-Sample Failure of Empirical Exchange Rate Models: Sampling Error or Misspecification? By Richard Meese; Kenneth Rogoff; Jacob Frenkel
  14. DSGE models in the frequency domain By Luca Sala

  1. By: Camila Epprecht (Centre d'Economie de la Sorbonne and Department of Electrical Engineering-Pontifical Catholic University of Rio de Janeiro); Dominique Guegan (Centre d'Economie de la Sorbonne - Paris School of Economics); Álvaro Veiga (Department of Electrical Engineering-Pontifical Catholic University of Rio de Janeiro)
    Abstract: In this paper, we compare two different variable selection approaches for linear regression models: Autometrics (automatic general-to-specific selection) and LASSO (ℓ1-norm regularization). In a simulation study, we assess the performance of the methods in terms of predictive power (out-of-sample forecasting) and in terms of correct model selection and estimation (in-sample). The case where the number of candidate variables exceeds the number of observations is considered as well. We also compare the properties of the estimators to those of the oracle estimator. Finally, we compare both methods in an application to GDP forecasting.
    Keywords: Model selection, variable selection, GETS, Autometrics, LASSO, adaptive LASSO, sparse models, oracle property, time series, GDP forecasting.
    JEL: C51 C52 C53
    Date: 2013–11
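The LASSO half of such a comparison can be sketched in a few lines (Autometrics is part of the commercial OxMetrics suite, so only the LASSO side is shown; the data-generating process, penalty level, and names are illustrative assumptions, not the authors' code):

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO via cyclic coordinate descent with soft-thresholding.
    A didactic sketch with a fixed penalty, rather than the
    cross-validated or adaptive penalties studied in the paper."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ beta + X[:, j] * beta[j]      # partial residual
            rho = X[:, j] @ r_j
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:3] = [3.0, -2.0, 1.5]                        # sparse truth
y = X @ true_beta + 0.1 * rng.standard_normal(n)

beta_hat = lasso_cd(X, y, lam=20.0)
selected = np.flatnonzero(beta_hat != 0)                # variables kept by LASSO
```

In a simulation study of the paper's kind, `selected` would be compared against the true support, and against the variables retained by the general-to-specific search.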
  2. By: Frank Schorfheide; Dongho Song
    Abstract: This paper develops a vector autoregression (VAR) for time series which are observed at mixed frequencies - quarterly and monthly. The model is cast in state-space form and estimated with Bayesian methods under a Minnesota-style prior. We show how to evaluate the marginal data density to implement a data-driven hyperparameter selection. Using a real-time data set, we evaluate forecasts from the mixed-frequency VAR and compare them to standard quarterly-frequency VAR and to forecasts from MIDAS regressions. We document the extent to which information that becomes available within the quarter improves the forecasts in real time.
    JEL: C11 C32 C53
    Date: 2013–12
  3. By: Barbara Rossi; Atsushi Inoue
    Abstract: This paper proposes new methodologies for evaluating out-of-sample forecasting performance that are robust to the choice of the estimation window size. The methodologies involve evaluating the predictive ability of forecasting models over a wide range of window sizes. We show that the tests proposed in the literature may lack the power to detect predictive ability and might be subject to data snooping across different window sizes if used repeatedly. An empirical application shows the usefulness of the methodologies for evaluating exchange rate models' forecasting ability.
    Keywords: Predictive Ability Testing, Forecast Evaluation, Estimation Window.
    JEL: C22 C52 C53
    Date: 2012–04
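The core idea — evaluating out-of-sample performance over a whole range of window sizes rather than one arbitrary choice — can be sketched as follows (an AR(1) one-step forecast on simulated data stands in for an exchange rate model; the paper's sup-type test statistics are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 400
y = np.zeros(T)
for t in range(1, T):                                   # simulate an AR(1) series
    y[t] = 0.5 * y[t - 1] + rng.standard_normal()

def oos_mse(y, window):
    """Rolling-window one-step-ahead forecast MSE of a no-intercept AR(1)."""
    errs = []
    for t in range(window, len(y) - 1):
        ys = y[t - window:t + 1]                        # estimation window ending at t
        phi = (ys[:-1] @ ys[1:]) / (ys[:-1] @ ys[:-1])  # OLS slope
        errs.append(y[t + 1] - phi * y[t])
    return float(np.mean(np.square(errs)))

# Evaluate over a grid of window sizes instead of a single choice.
mse_path = {w: oos_mse(y, w) for w in range(50, 201, 50)}
```

Reporting the whole `mse_path` (and, as in the paper, basing formal inference on functionals of it such as its supremum) removes the temptation to report only the most favourable window size.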
  4. By: Osmani Teixeira de Carvalho Guillén; Alain Hecq; João Victor Issler; Diogo Saraiva
    Abstract: This paper has two original contributions. First, we show that PV relationships entail a weak-form SCCF restriction, as in Hecq et al. (2006) and in Athanasopoulos et al. (2011), and imply a polynomial serial correlation common feature relationship (Cubadda and Hecq, 2001). These represent short-run restrictions on the dynamic multivariate systems, something that has not been discussed before. Our second contribution relates to forecasting multivariate time series that are subject to PVM restrictions, which has wide application in macroeconomics and finance. We build on previous work showing the benefits for forecasting of constraining the short-run dynamics of the system. Appropriate common-cycle restrictions improve forecasting because they identify linear combinations of the first differences of the data that cannot be forecast from past information. This embeds natural exclusion restrictions that prevent the estimation of useless parameters, which would otherwise increase the forecast variance with no expected reduction in bias. We apply the techniques discussed in this paper to data known to be subject to PV restrictions: the online series maintained and updated by Robert J. Shiller. We focus on three different data sets. The first includes the levels of interest rates with long and short maturities, the second the level of real price and dividend for the S&P composite index, and the third the logarithmic transformation of prices and dividends. Our exhaustive investigation of six different multivariate models reveals that better forecasts can be achieved when restrictions are applied to them: specifically, under cointegration restrictions, under combined cointegration and weak-form SCCF rank restrictions, and under the full set of theoretical restrictions embedded in the PVM.
    Date: 2013–10
  5. By: Barbara Rossi
    Abstract: We explore the linkage between equity and commodity markets, focusing in particular on its evolution over time. We document that a country's equity market value has significant out-of-sample predictive ability for the future global commodity price index for several primary commodity-exporting countries. The out-of-sample predictive ability of the equity market appears around the early 2000s. The results are robust to using several control variables as well as firm-level equity data. Finally, our results indicate that exchange rates are a better predictor of commodity prices than equity markets, especially at very short horizons.
    Keywords: Commodity prices, equity prices, exchange rates, forecasting.
    JEL: C22 C52 C53
    Date: 2012–10
  6. By: Adam Clements (QUT); Ayesha Scott (QUT); Annastiina Silvennoinen (QUT)
    Abstract: The importance of covariance modelling has long been recognised in the field of portfolio management, and large-dimensional multivariate problems are increasingly becoming the focus of research. This paper provides a straightforward and commonsense approach to investigating a number of models used to generate forecasts of the covariance matrix for large-dimensional problems. We find evidence in favour of assuming equicorrelation across various portfolio sizes, particularly during times of crisis. A portfolio allocation problem is used to compare the forecasting methods: the global minimum variance portfolio and the Model Confidence Set serve as the basis for comparison, whilst portfolio weight stability and economic value are also considered.
    Keywords: Volatility, multivariate GARCH, portfolio allocation
    JEL: C22 G11 G17
    Date: 2013–12–11
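Under equicorrelation the correlation matrix collapses to R = (1 − ρ)I + ρJ with a single parameter ρ, which is what keeps very large dimensions tractable. A minimal sketch of the construction, together with the global minimum variance weights used as the allocation criterion above (names and numbers are illustrative, not from the paper):

```python
import numpy as np

def equicorr_cov(vols, rho):
    """Covariance matrix under equicorrelation: every pairwise correlation
    equals rho. Positive definite for -1/(N-1) < rho < 1."""
    N = len(vols)
    R = np.full((N, N), rho)     # rho everywhere...
    np.fill_diagonal(R, 1.0)     # ...except ones on the diagonal
    D = np.diag(vols)
    return D @ R @ D

vols = np.array([0.20, 0.30, 0.25])          # per-asset volatilities
Sigma = equicorr_cov(vols, rho=0.4)

# Global minimum variance portfolio: w ∝ Sigma^{-1} 1, normalized to sum to 1.
ones = np.ones(len(vols))
w = np.linalg.solve(Sigma, ones)
w /= w.sum()
```

In practice ρ would be estimated, for instance as the average pairwise correlation of GARCH-standardized returns in DECO-style models, rather than fixed.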
  7. By: Blaufus, Kay; Bob, Jonathan; Trinks, Matthias
    Abstract: Tax accounting and tax law involve probability thresholds that can require the taxpayer to estimate the likelihood that a tax position would be upheld by a court. Tax complexity and the consequent ambiguity result in most taxpayers relying on a tax expert's estimate of this likelihood. This study examines whether tax experts are able to accurately forecast the outcome of tax court decisions and compares tax experts' predictions to those of laymen. Our results reveal no significant differences in the forecasting performance of professional tax advisors and laymen. Moreover, the tax advisors exhibit a significantly higher level of overconfidence than laymen, and the degree of overconfidence increases with professional experience. A comparison of two groups of tax experts, tax advisors and revenue agents, demonstrates that tax advisors exhibit the highest level of overconfidence and form stronger appeal recommendations, which indicates a type of advisor bias.
    Keywords: tax risk,overconfidence,client advocacy,tax controversy,forecasting
    JEL: M40 K20 H20
    Date: 2013
  8. By: Jennie Bai; Thomas Philippon; Alexi Savov
    Abstract: The finance industry has grown, financial markets have become more liquid, and information technology allows arbitrageurs to trade faster than ever. But have market prices become more informative? We use stock and bond prices to forecast earnings and find that the information content of market prices has not improved since 1960. We use a model with information acquisition and investment to link financial development, price informativeness, and allocational efficiency. As information costs fall, the predictable component of future earnings should rise and hence improve capital allocation and welfare. We find that this component has remained stable over the past 50 years. When we decompose price informativeness into real price efficiency and forecasting price efficiency, we find that both have remained stable.
    JEL: E2 G1 N2
    Date: 2013–12
  9. By: Jeffrey Frankel; Jesse Schreger
  10. By: Jui-Chung Yang; Ke-Li Xu
    Abstract: The reaction coefficients of expected inflation and output gaps in the forecast-based monetary policy reaction function may be only weakly identified when the smoothing coefficient is close to unity and the nominal interest rates are highly persistent. In this paper we modify the method of Andrews and Cheng (2012, Econometrica) on inference under weak / semi-strong identification to accommodate the persistence issue. Our modification employs asymptotic theories for near unit root processes and novel drifting sequence approaches. Large sample properties with the desired smooth transition with respect to the true values of parameters are developed for the nonlinear least squares (NLS) estimator and its corresponding t / Wald statistics for a general class of models. Despite the lack of consistent estimability, conservative confidence sets for the weakly identified parameters of interest can be obtained by inverting the t / Wald tests. We show that the null-imposed least-favorable confidence sets have correct asymptotic sizes, while the projection-based method may lead to asymptotic over-coverage. Our empirical application suggests that the NLS estimates of the reaction coefficients in the U.S. forecast-based monetary policy reaction function for 1987:3–2007:4 are not sufficiently accurate to rule out the possibility of indeterminacy.
    JEL: C12 C22 E58
    Date: 2013–12–08
  11. By: Kaoru Tone (National Graduate Institute for Policy Studies)
    Abstract: In this paper, we propose new resampling models in data envelopment analysis (DEA). Input/output values are subject to change for several reasons, e.g., measurement errors, hysteretic factors, arbitrariness and so on. Furthermore, these variations differ across input/output items and across decision-making units (DMUs). Hence, DEA efficiency scores need to be examined with these factors taken into account, and resampling based on these variations is necessary for gauging the confidence intervals of DEA scores. We propose three resampling models. The first assumes downside and upside measurement error rates for each input/output, common to all DMUs; data are resampled from the triangular distribution that the downside and upside errors define around the observed values. The second model utilizes historical data, e.g., past–present, for estimating data variations, imposing chronological order weights supplied by the Lucas series (a variant of the Fibonacci series). The last one deals with future prospects: this model aims at forecasting the future efficiency score and its confidence interval for each DMU.
    Date: 2013–12
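The first of the three resampling schemes can be sketched as follows (a single-input, single-output ratio proxy replaces the full DEA linear program; the 5% error rates and the data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
inputs = np.array([10.0, 12.0, 8.0])         # one input per DMU
outputs = np.array([20.0, 30.0, 18.0])       # one output per DMU
down, up = 0.05, 0.05                        # assumed error rates, common to all DMUs

def scores(x, y):
    """Efficiency proxy: output/input ratio relative to the best DMU
    (equivalent to the CCR score in the one-input, one-output case)."""
    ratio = y / x
    return ratio / ratio.max()

B = 1000
draws = np.empty((B, len(inputs)))
for b in range(B):
    # Redraw each value from a triangular distribution around the observation,
    # with limits set by the downside and upside error rates.
    xb = rng.triangular(inputs * (1 - down), inputs, inputs * (1 + up))
    yb = rng.triangular(outputs * (1 - down), outputs, outputs * (1 + up))
    draws[b] = scores(xb, yb)

ci = np.percentile(draws, [2.5, 97.5], axis=0)   # 95% interval per DMU
```

With multiple inputs and outputs, `scores` would be replaced by a DEA solver run on each resampled data set.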
  12. By: Florian Chatagny (KOF Swiss Economic Institute, ETH Zurich, Switzerland)
    Abstract: Predicting available tax revenue accurately is a key step in fiscal policy. It has recently been shown that revenue prediction errors have a direct impact on fiscal deficits. In the current paper, we explore the relationship between the ideology of the finance minister and tax revenue projection errors, and assess how the stringency of fiscal rules alters this relationship. We use a panel dataset on 26 Swiss cantons over the period 1980-2007 as well as a new dataset on 99 finance ministers at the cantonal level. We find a rather counter-intuitive positive relationship between the ideology of the finance minister and tax revenue projection errors, in the sense that a more left-wing finance minister produces relatively more conservative forecasts. We also find that fiscal rules reduce the effect of ideology on tax revenue projection errors. These results suggest that left-wing finance ministers need to curb deficits relatively more in order to signal to voters the same level of competence as a right-wing finance minister. They also suggest that fiscal rules render the signal less informative to voters and thereby reduce the incentive for left-wing finance ministers to be conservative in their projections.
    Keywords: Ideology, Finance minister, Fiscal Rules, Tax Revenue Projections
    JEL: C23 H68 H71
    Date: 2013–12
  13. By: Richard Meese; Kenneth Rogoff; Jacob Frenkel
  14. By: Luca Sala
    Abstract: We use frequency domain techniques to estimate a medium-scale DSGE model on different frequency bands. We show that goodness of fit, forecasting performance and parameter estimates vary substantially with the frequency bands over which the model is estimated. Estimates obtained using subsets of frequencies are characterized by significantly different parameters, an indication that the model cannot match all frequencies with one set of parameters. In particular, we find that: i) the low frequency properties of the data strongly affect parameter estimates obtained in the time domain; ii) the importance of economic frictions in the model changes when different subsets of frequencies are used in estimation. This is particularly true for the investment cost friction and habit persistence: when low frequencies are present in the estimation, the investment cost friction and habit persistence are estimated to be higher than when low frequencies are absent.
    Keywords: DSGE models, frequency domain, band maximum likelihood
    JEL: C11 C32 E32
    Date: 2013

This nep-for issue is ©2013 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.