Forecasting
http://lists.repec.org/mailman/listinfo/nep-for
Forecasting
2017-04-09
Does foreign sector help forecast domestic variables in DSGE models?
http://d.repec.org/n?u=RePEc:ekd:009007:9393&r=for
Estimated dynamic stochastic general equilibrium (DSGE) models are now used around the world for policy analysis. They have become particularly popular in central banks, some of which have successfully applied them to generate macroeconomic forecasts. Arguably, one of the key drivers behind this trend was growing evidence that DSGE model-based forecasts can be competitive with those obtained with flexible time series models such as vector autoregressions (VARs), and also with expert judgement; see e.g. Smets and Wouters (2007), Edge et al. (2010), Kolasa et al. (2012) and Del Negro and Schorfheide (2012). The vast majority of these studies focus on the US economy, as it allows the forecast quality to be evaluated over a relatively large number of periods and also makes the convenient closed economy assumption acceptable. Open economy applications that use the New Open Economy Macroeconomics (NOEM) framework originating from Obstfeld and Rogoff (1995) and extended by Devereux and Engel (2003) and Gali and Monacelli (2005) do exist, but usually base their conclusions on a rather short evaluation sample. The earliest contributions to this literature are Bergin (2003), who tests small open economy DSGE models for Australia, Canada and the United Kingdom, and Bergin (2006), where a two-country model for the US and G7 is considered. However, only in-sample forecasts are evaluated in these papers. The literature testing open economy DSGE model-based forecasts out of sample includes Adolfson et al. (2007) and Christoffel et al. (2010) for the euro area, Adolfson et al. (2008) for Sweden, Matheson (2010) for Australia, Canada and New Zealand, Gupta et al. (2010) and Alpanda et al. (2011) for South Africa, and Marcellino et al. (2014) for Luxembourg within the euro area. Following the literature working with closed economy models, the common practice is to evaluate forecasts generated with the NOEM framework relative to those obtained with some variant of Bayesian VARs.
The overall finding is that open economy DSGE models are quite competitive, even though the conclusions differ by variable and country. However, none of these studies tells us how much we actually gain by accounting for a foreign block in DSGE models. Since this question is essentially about the empirical validity of the key NOEM ingredients, i.e. the theoretical additions over the standard quantitative business cycle framework that relate to the open economy dimension, we argue that not having an answer to it constitutes an important gap. There are, in fact, reasons to be sceptical about the empirical success of the NOEM framework. In an influential paper, Justiniano and Preston (2010) demonstrate that estimated small open economy DSGE models fail to account for the substantial influence of foreign shocks identified in many reduced-form studies. They show that capturing the observed comovement between domestic and foreign variables generates counterfactual implications for other variables, especially the real exchange rate and the terms of trade. It is also well known that NOEM models have difficulty explaining swings in exchange rates and current account balances (Engel, 2014; Gourinchas and Rey, 2014). On the bright side, Ca'Zorzi et al. (2016) show that real exchange rate forecasts from small open economy DSGE models beat the random walk at medium and long horizons and are very competitive with tougher benchmarks. In this paper we evaluate the forecasting performance of a state-of-the-art small open economy NOEM model developed by Justiniano and Preston (2010) relative to its associated New Keynesian (NK) closed economy benchmark. We focus on forecast accuracy for three standard macrovariables that appear in both models: output, inflation and the short-term interest rate.
Several variants of the NOEM model are considered that differ in the set of foreign sector variables used in estimation, which include the real exchange rate, terms of trade, current account balance, as well as foreign output, inflation and interest rates. Our conclusions are based on evidence from three economies, i.e. Australia, Canada and the United Kingdom, for which we can collect data that date back far enough to make our evaluation sample large compared to previous studies. Our findings are mainly negative. When we consider the full NOEM model, its point and density forecasts for domestic variables are statistically indistinguishable from, and in most cases even significantly less accurate than, those produced by the standard NK benchmark. Alternative model variants that leave either the terms of trade, or both the terms of trade and foreign variables, unobservable do not fare much better, and also do not offer much improvement relative to the closed economy variant. We show that these results are consistent with evidence from BVARs, as expanding their dimension with foreign sector variables also does not usually improve their forecasting performance. However, this feature of BVARs is not surprising in light of the earlier literature stressing that small-scale VARs can forecast much better than large-scale VARs (Gurkaynak et al., 2013), as the number of estimated parameters grows very fast with the number of variables included in this class of models. In contrast, the open economy DSGE model considered in this paper is still very sparsely parameterized, hence its lack of competitiveness relative to the closed economy benchmark points to important misspecification of the underlying theoretical structure. We support this claim by showing that even the richly specified NOEM model generates several counterfactual predictions about the comovement between domestic and foreign sector variables.
Marcin Kolasa
Michal Rubaszek
Australia, Canada, United Kingdom, Forecasting and projection methods, General equilibrium modeling
2016-07-04
Assessing the Business Outlook Survey Indicator Using Real-Time Data
http://d.repec.org/n?u=RePEc:bca:bocadp:17-5&r=for
Every quarter, the Bank of Canada conducts consultations with businesses across Canada, referred to as the Business Outlook Survey (BOS). A principal-component analysis conducted by Pichette and Rennison (2011) led to the development of the BOS indicator, which summarizes survey results and is used by the Bank as a gauge of overall business sentiment. In this paper, we examine whether data vintages matter when assessing the predictive content of the BOS indicator and individual BOS questions, and whether the BOS is a better indicator of revised or unrevised macroeconomic data. As an indicator of business sentiment in the context of monetary policy, the reliability of the BOS is essential, and it is crucial to understand whether the signals it sends are best interpreted for early-released or revised data. For this purpose, we use different forecasting methods that take into account the real-time perspective of the data. Results from the different methods show that the BOS content is informative regardless of data revisions. However, in real time, the BOS indicator and individual BOS questions are found to produce better nowcasts of first-released or partially revised data than of latest-available data. This is particularly important in the case of growth in real business investment. In fact, because revisions to real business investment are more volatile than revisions to real gross domestic product (GDP), the choice of data vintages when assessing the ability of the BOS to forecast growth appears to be more important for real business investment than for real GDP.
Lise Pichette
Marie-Noëlle Robitaille
Business fluctuations and cycles, Regional economic developments
2017
Energy Scenarios: The Value and Limits of Scenario Analysis
http://d.repec.org/n?u=RePEc:ekd:009007:9371&r=for
The need for a low-carbon world has added a challenging new dimension to the development of long-term energy scenarios. In addition to traditional factors like technological progress and demographic, economic, political and institutional considerations, modern energy forecasts must now address the coverage, timing, and stringency of policies to mitigate greenhouse gas emissions and air pollutants. The goal of this paper is to review the value and limits of energy scenarios and, in particular, to assess how the new low-carbon goals are reflected in the latest projections. The results from a long-term global energy-economic model, the MIT Economic Projection and Policy Analysis (EPPA) model, are compared with other major outlooks (such as those from BP, ExxonMobil and the IEA) and with model-comparison exercises (represented in the IPCC scenario database). Considering the value and limits of energy scenarios, it is obviously easier to identify the limits of the forecasts. This is true not only of energy projections but also of other predictions of the future, be they financial, economic or political. Forecasts of all sorts are usually bad at predicting sudden changes. In terms of energy projections, the fast growth of China’s energy appetite and its recent slowdown, the fast development of unconventional oil and natural gas, and the fast deployment of renewables were all missed by most scenarios. A move to a low-carbon energy future requires a drastic change in energy investment and the resulting mix of energy technologies. 
If history is any guide, energy scenarios overestimate the extent to which the future will look like the recent past. Energy scenarios are useful for decision-makers to assess the scale of the necessary transformation. However, the exact technology mix, the paths to the needed mix, and price and cost projections should be treated with a great degree of caution. The scenarios do not provide exact numbers (or even close numbers), but they can be used for a qualitative analysis of the decision-making risks associated with different pathways. We should recognize that the energy system is complex, interconnected and affected by economic drivers, and economists are notoriously poor forecasters. Because energy infrastructure is long-lived, the energy system is not as fluid as the economic system, and some trajectories in energy development are more persistent; still, the same degree of caution should be applied to long-term energy forecasts as to economic forecasts. Energy scenario models are complex, yet they do not capture all of the system's complexity, so they often provide imprecise projections. At the same time, without models nothing at all constrains the projections. Energy scenarios are not suited to providing exact numbers (or specific forecasts), but in practice decisions have to be made. The value of energy scenarios (and of the models that produce them) lies not in decision-making but in decision support. No single energy scenario provides a prediction of the future, and none can be used on its own as a basis for policy-making. However, results from numerous scenarios obtained from different modelling platforms provide useful information about the potential risks and benefits of a particular policy or investment. When one has a model to make a scenario, arguments can be made about improving it, simplifying it, or adding detail. 
When one has just tea leaves to guess the future, there is no tool to advance knowledge. Most energy scenarios offer a plausible rather than the most likely future. Perhaps the most important uncertainty about the future of energy is its interaction with the environment. The need for low-emitting technologies will shift the current technology mix, but the exact contribution of a particular technology and the timing of this shift depend on many economic and political variables. Such uncertainty about future costs and technologies supports the conclusion that governments should not try to pick the “winners”; rather, the policy and investment focus should be on targeting emissions reductions from any energy source. Energy scenarios may not provide exact projections, but they are the best available tool to assess the magnitude of the challenges that lie ahead.
Sergey Paltsev
Global, Impact and scenario analysis, Energy and environmental policy
2016-07-04
Conditional forecasting with DSGE models - A conditional copula approach
http://d.repec.org/n?u=RePEc:bno:worpap:2017_04&r=for
DSGE models may be misspecified in many dimensions, which can affect their forecasting performance. To correct for these misspecifications, we can apply conditional information from other models or judgment. Conditional information is not exact, and can be provided as a probability distribution over different outcomes. These probability distributions are often provided as a set of marginal distributions. To be able to condition on this information in a structural model, we must construct the multivariate distribution of the conditional information, i.e. we need to draw multivariate paths from this distribution. One way to do this is to draw from the marginal distributions given a correlation structure between the different marginal distributions. In this paper we use the theoretical correlation structure of the model and a copula to solve this problem. The copula approach makes it possible to accommodate more flexible assumptions about the conditional information, such as skewness and/or fat tails in the marginal density functions. This method may not only improve density forecasts from the DSGE model, but can also be used to interpret the conditional information in terms of structural shocks/innovations.
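The copula step the abstract describes — drawing correlated paths whose marginals match given (possibly skewed) distributions — can be sketched as follows. This is a minimal Gaussian-copula illustration, not the paper's actual model: the two marginals, the correlation matrix and the variable labels are placeholder assumptions (in the paper the correlation structure comes from the DSGE model itself).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical conditional information: two marginal densities, one skewed.
marginals = [stats.norm(loc=2.0, scale=0.5),            # e.g. output growth
             stats.skewnorm(a=-4, loc=2.5, scale=1.0)]  # e.g. inflation, left-skewed

# Hypothetical correlation structure between the marginals.
R = np.array([[1.0, 0.6],
              [0.6, 1.0]])

def gaussian_copula_draws(marginals, R, n, rng):
    """Draw joint samples with the given marginals via a Gaussian copula."""
    L = np.linalg.cholesky(R)
    z = rng.standard_normal((n, len(marginals))) @ L.T  # correlated normals
    u = stats.norm.cdf(z)                               # uniform marginals
    # Invert each target marginal CDF to impose the desired distributions.
    return np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])

draws = gaussian_copula_draws(marginals, R, n=10_000, rng=rng)
```

Each row of `draws` is one joint realisation; the marginal shapes (including the skewness of the second variable) are preserved while the copula imposes the dependence structure.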
Kenneth Sæterhagen Paulsen
DSGE model, conditional forecast, copula
2017-04-07
Modelling and forecasting money demand: divide and conquer
http://d.repec.org/n?u=RePEc:apc:wpaper:2017-091&r=for
The literature on money demand suggests several specifications of empirical functions that describe observed data on money in circulation. In a first stage, we select the best long-run model specification for a money demand function at the aggregate level based on forecast performance. In a second stage, we divide money in circulation by denomination and argue that the determinants of demand for low denominations differ from those for high denominations. We then estimate the best model specification for each denomination and aggregate the individual forecasts to obtain an aggregate projection. We finally compare the forecasts from these two strategies. Our results indicate that the bottom-up approach performs better than the traditional approach of directly forecasting the aggregate.
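The bottom-up idea can be sketched in a few lines: forecast each denomination separately, then sum the forecasts, and compare with forecasting the aggregate directly. This is only an illustration under stated assumptions — the AR(1) forecaster and the simulated series stand in for the paper's long-run (co-integration based) specifications and real denomination data.

```python
import numpy as np

def ar1_forecast(y, horizon):
    """Fit an AR(1) with intercept by OLS, then iterate it forward."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    path, last = [], y[-1]
    for _ in range(horizon):
        last = c + phi * last
        path.append(last)
    return np.array(path)

# Toy denomination series with different drifts (placeholders for real data).
rng = np.random.default_rng(1)
denoms = [np.cumsum(rng.normal(mu, 1.0, 200)) + 100 for mu in (0.1, 0.3, 0.5)]

# Bottom-up: forecast each denomination separately, then sum the forecasts.
bottom_up = sum(ar1_forecast(y, horizon=8) for y in denoms)

# Top-down benchmark: forecast the aggregate series directly.
top_down = ar1_forecast(sum(denoms), horizon=8)
```

Which strategy wins is an empirical matter; the paper's finding is that the summed denomination forecasts are more accurate than the direct aggregate forecast.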
César Carrera
Jairo Flores
Money demand, bottom-up, co-integration, forecast
2017-04
Intuitive and Reliable Estimates of the Output Gap from a Beveridge-Nelson Filter
http://d.repec.org/n?u=RePEc:swe:wpaper:2016-09a&r=for
The Beveridge-Nelson (BN) trend-cycle decomposition based on autoregressive forecasting models of U.S. quarterly real GDP growth produces estimates of the output gap that are strongly at odds with widely-held beliefs about the amplitude, persistence, and even sign of transitory movements in economic activity. These antithetical attributes are related to the autoregressive coefficient estimates implying a very high signal-to-noise ratio in terms of the variance of trend shocks as a fraction of the overall quarterly forecast error variance. When we impose a lower signal-to-noise ratio, the resulting BN decomposition, which we label the “BN filter”, produces a more intuitive estimate of the output gap that is large in amplitude, highly persistent, and typically increases in expansions and decreases in recessions. Real-time estimates from the BN filter are also reliable in the sense that they are subject to smaller revisions and predict future output growth and inflation better than estimates from other methods of trend-cycle decomposition that also impose a low signal-to-noise ratio, including deterministic detrending, the Hodrick-Prescott filter, and the bandpass filter.
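For the simplest AR(1) case, the BN cycle has a closed form: it is minus the expected sum of future growth deviations, i.e. -phi/(1-phi) times the current deviation of growth from its mean. The sketch below illustrates this on simulated data; imposing a fixed phi is only a crude stand-in for the paper's signal-to-noise restriction, and the series is not U.S. GDP.

```python
import numpy as np

def bn_cycle_ar1(growth, phi=None):
    """BN cycle of the level series implied by an AR(1) model for growth."""
    g = np.asarray(growth, dtype=float)
    dev = g - g.mean()               # growth deviations from the sample mean
    if phi is None:                  # estimate persistence by OLS if not imposed
        phi = (dev[:-1] @ dev[1:]) / (dev[:-1] @ dev[:-1])
    # BN trend = y_t + E_t sum_{j>=1}(g_{t+j} - mu) = y_t + phi/(1-phi)*dev_t,
    # so the BN cycle y_t - trend_t is the negative of that expected sum.
    return -phi / (1.0 - phi) * dev

# Simulated AR(1) growth data (mean 0.5, persistence 0.4), for illustration.
rng = np.random.default_rng(2)
g = np.empty(200)
g[0] = 0.5
for t in range(1, 200):
    g[t] = 0.5 * (1 - 0.4) + 0.4 * g[t - 1] + rng.normal(0, 1)

cycle = bn_cycle_ar1(g)                      # persistence estimated from data
cycle_persistent = bn_cycle_ar1(g, phi=0.9)  # imposing higher persistence
```

Note how the amplitude of the implied cycle grows with the imposed persistence; this is the mechanism by which restricting the forecasting model changes the character of the estimated output gap.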
Gunes Kamber
James Morley
Benjamin Wong
Beveridge-Nelson decomposition, output gap, signal-to-noise ratio
2017-01
Modelling and forecasting WIG20 daily returns
http://d.repec.org/n?u=RePEc:nip:nipewp:09/2017&r=for
The purpose of this paper is to model daily returns of the WIG20 index. The idea is to consider a model that explicitly takes into account changes in the amplitude of the clusters of volatility. This variation is modelled by a positive-valued deterministic component. A novelty in the specification of the model is that the deterministic component is specified before estimating the multiplicative conditional variance component. The resulting model is subjected to misspecification tests and its forecasting performance is compared with that of commonly applied models of conditional heteroskedasticity.
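The two-step multiplicative structure described here can be illustrated as follows: estimate a smooth deterministic variance component g_t first, rescale the returns by its square root, and only then fit a conditional-variance (GARCH-type) model to the rescaled series. The kernel smoother and the simulated returns below are placeholder assumptions, not the paper's deterministic specification or WIG20 data.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 1000
t = np.arange(T) / T
g_true = 0.5 + np.sin(np.pi * t) ** 2          # slowly varying variance level
r = np.sqrt(g_true) * rng.standard_normal(T)   # toy returns, no GARCH part

def deterministic_variance(r, bandwidth=0.1):
    """Kernel (Nadaraya-Watson) estimate of the slow variance component."""
    n = len(r)
    u = np.arange(n) / n
    w = np.exp(-0.5 * ((u[:, None] - u[None, :]) / bandwidth) ** 2)
    return (w @ (r ** 2)) / w.sum(axis=1)

g_hat = deterministic_variance(r)
r_std = r / np.sqrt(g_hat)   # this rescaled series would go into the GARCH step
```

After the rescaling, `r_std` should have roughly unit unconditional variance, so any remaining volatility clustering can be attributed to the multiplicative conditional variance component.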
Cristina Amado
Annastiina Silvennoinen
Timo Teräsvirta
2017
Cohort effects in mortality modelling: a Bayesian state-space approach
http://d.repec.org/n?u=RePEc:arx:papers:1703.08282&r=for
Cohort effects are important factors in determining the evolution of human mortality for certain countries. Extensions of dynamic mortality models with cohort features have been proposed in the literature to account for these factors under the generalised linear modelling framework. In this paper we approach the problem of mortality modelling with cohort factors incorporated through a novel formulation under a state-space methodology. In the process we demonstrate that cohort factors can be formulated naturally under the state-space framework, despite the fact that cohort factors are indexed by year of birth rather than by year. Bayesian inference for cohort models in a state-space formulation is then developed based on an efficient Markov chain Monte Carlo sampler, allowing for the quantification of parameter uncertainty in cohort models and in the resulting mortality forecasts used for life expectancy and life table constructions. The effectiveness of our approach is examined through comprehensive empirical studies involving male and female populations from various countries. Our results show that cohort patterns are present for certain countries we studied and that the inclusion of cohort factors is crucial in capturing these phenomena, thus highlighting the benefits of introducing cohort models in the state-space framework. Forecasting with cohort models is also discussed in light of the projection of cohort factors.
Man Chung Fung
Gareth W. Peters
Pavel V. Shevchenko
2017-03