nep-for New Economics Papers
on Forecasting
Issue of 2006‒10‒21
six papers chosen by
Rob J Hyndman
Monash University

  1. Forecasting with a forward-looking DGE model: combining long-run views of financial markets with macro forecasting By Männistö, Hanna-Leena
  2. Robust volatility forecasts and model selection in financial time series By L. Grossi; G. Morelli
  3. Forecasting and testing a non-constant volatility By Abramov, Vyacheslav; Klebaner, Fima
  4. Inflation expectations and regime shifts in the euro area By Virén, Matti
  5. Demand for storage of natural gas in northwestern Europe. A simulation based forecast 2006-2030 By Felix Hoeffler; Madjid Kuebler
  6. On ignoring scientific evidence: The bumpy road to enlightenment By Robin Hogarth

  1. By: Männistö, Hanna-Leena (Bank of Finland Research)
    Abstract: To develop forecasting procedures with a forward-looking dynamic general equilibrium model, we built a small New Keynesian model and calibrated it to euro area data. It was essential in this context that we allowed for long-run growth in GDP. We brought additional asset price equations, based on the expectations hypothesis and the Gordon growth model, into the standard open economy model in order to extract information on private sector long-run expectations of fundamentals and to combine that information into the macroeconomic forecast. We propose a method of transforming the model for forecasting use in such a way as to match, in an economically meaningful way, the short-term forecast levels, especially of the model's jump variables, to the parameters affecting the long-run trends of the key macroeconomic variables. More specifically, in the model we have used for illustrative purposes, we pinned down the long-run inflation expectations and the domestic and foreign potential growth rates using the model's steady state solution in combination with, by assumption, forward-looking information in up-to-date financial market data. Consequently, our proposed solution preserves consistency with market expectations and results, as a favourable by-product, in forecast paths with no initial, first-forecast-period jumps. Furthermore, no ad hoc re-calibration is called for in the proposed forecasting procedures, which is clearly an advantage from the point of view of transparency in communication.
    Keywords: forecasting; New Keynesian model; DSGE model; rational expectations; open economy
    JEL: E17 E30 E31 F41
    Date: 2005–10–11
    URL: http://d.repec.org/n?u=RePEc:hhs:bofrdp:2005_021&r=for
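    A minimal sketch, not from the paper, of the Gordon-growth inversion the abstract refers to: the model P = D / (r - g) can be solved for the growth rate markets appear to expect, g = r - D / P. All numbers below are hypothetical.

      # Illustrative only: back out an implied long-run growth expectation
      # from the Gordon growth model P = D / (r - g), i.e. g = r - D / P.
      def implied_long_run_growth(price, next_dividend, required_return):
          """Invert the Gordon growth model to recover the growth rate
          consistent with an observed price and an assumed discount rate."""
          return required_return - next_dividend / price

      # Hypothetical numbers (not the paper's calibration):
      price, dividend, discount_rate = 100.0, 3.0, 0.07
      g = implied_long_run_growth(price, dividend, discount_rate)
      print(f"implied long-run nominal growth: {g:.2%}")   # 4.00%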
  2. By: L. Grossi; G. Morelli
    Abstract: In order to cope with the stylized facts of financial time series, many models have been proposed within the GARCH family (e.g. EGARCH, GJR-GARCH, QGARCH, FIGARCH, LSTGARCH) and among stochastic volatility (SV) models. Generally, all these models tend to produce very similar forecasting performance, and most of the time it is difficult to choose the most appropriate specification. In addition, all these models are very sensitive to the presence of atypical observations. The purpose of this paper is to provide the user with new robust model selection procedures for financial models which downweight or eliminate the effect of atypical observations. The extreme case is when outliers are treated as missing data. In this paper we extend the theory of missing data to the family of GARCH models and show how to robustify the loglikelihood to make it insensitive to the presence of outliers. The suggested procedure enables us both to detect atypical observations and to select the best models in terms of forecasting performance.
    Keywords: GARCH models, extreme value, robust estimation
    JEL: C16 C22 C53 G15
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:par:dipeco:2006-se02&r=for
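    A minimal sketch of the general idea of robustifying a GARCH loglikelihood by downweighting outlying observations; this is an illustrative assumption-laden toy, not the authors' procedure, and the weighting scheme and cutoff are made up for the example.

      import numpy as np

      def robust_garch11_loglik(params, returns, cutoff=3.0):
          """Negative weighted Gaussian log-likelihood of a GARCH(1,1).
          omega, alpha, beta are the variance-equation parameters; observations
          with large standardized residuals get Huber-type weights < 1."""
          omega, alpha, beta = params
          n = len(returns)
          sigma2 = np.empty(n)
          sigma2[0] = np.var(returns)
          for t in range(1, n):
              sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
          z = returns / np.sqrt(sigma2)                          # standardized residuals
          weights = np.minimum(1.0, (cutoff / (np.abs(z) + 1e-12)) ** 2)  # downweight |z| > cutoff
          loglik = -0.5 * (np.log(2 * np.pi) + np.log(sigma2) + z ** 2)
          return -np.sum(weights * loglik)

      # Minimizing this criterion (e.g. with scipy.optimize.minimize under
      # positivity and alpha + beta < 1 constraints) gives outlier-resistant
      # estimates; comparing candidate models on such a robust criterion
      # loosely mirrors the model-selection idea described above.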
  3. By: Abramov, Vyacheslav; Klebaner, Fima
    Abstract: In this paper we study volatility functions. Our main assumption is that the volatility is deterministic or stochastic but driven by a Brownian motion independent of the stock. We propose a forecasting method and check its consistency with option pricing theory. To estimate the unknown volatility function we use the approach of Goldentayer, Klebaner and Liptser, based on filters for estimating an unknown function from its noisy observations. One of the main assumptions is that the volatility is a continuous function, with a derivative satisfying some smoothness conditions. The two forecasting methods correspond to first and second order filters: the first order filter tracks the unknown function, while the second order filter tracks the function and its derivative. The quality of forecasting therefore depends on the type of volatility function: if oscillations of volatility around its average are frequent, the first order filter seems appropriate; otherwise the second order filter is better. Further, in deterministic volatility models the price of options is given by the Black-Scholes formula with averaged future volatility (Hull and White 1987; Stein and Stein 1991). This enables us to compare the implied volatility with the averaged estimated historical volatility. This comparison is done for five companies and shows that the implied volatility and the historical volatilities are not statistically related.
    Keywords: Non-constant volatility; approximating and forecasting volatility; Black-Scholes formula; best linear predictor
    JEL: G13
    Date: 2006–06–06
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:207&r=for
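    A minimal sketch of the comparison the abstract describes: price a call with the Black-Scholes formula using an averaged historical/forecast volatility, and back out the implied volatility from a market quote. The inputs below are hypothetical, and the bisection solver is just one simple way to invert the formula.

      import math

      def bs_call(S, K, T, r, sigma):
          """Black-Scholes price of a European call."""
          d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
          d2 = d1 - sigma * math.sqrt(T)
          N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
          return S * N(d1) - K * math.exp(-r * T) * N(d2)

      def implied_vol(market_price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
          """Bisection on sigma; assumes the quote lies between bs_call(lo) and bs_call(hi)."""
          for _ in range(200):
              mid = 0.5 * (lo + hi)
              if bs_call(S, K, T, r, mid) < market_price:
                  lo = mid
              else:
                  hi = mid
              if hi - lo < tol:
                  break
          return 0.5 * (lo + hi)

      # Hypothetical inputs: averaged estimated volatility of 22% vs an option quote of 7.0.
      S, K, T, r = 100.0, 100.0, 0.5, 0.03
      print("model price with averaged volatility:", round(bs_call(S, K, T, r, 0.22), 4))
      print("implied volatility from market quote:", round(implied_vol(7.0, S, K, T, r), 4))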
  4. By: Virén, Matti (Bank of Finland and University of Turku)
    Abstract: This paper focuses on the determination of inflation expectations. The following two questions are examined: how much do inflation expectations reflect different economic and institutional regime shifts, and in which way do inflation expectations adjust to past inflation? The basic idea in the analysis is the assumption that inflation expectations do not mechanically reflect past inflation, as an econometric specification may de facto assume, but rather depend on the relevant economic regime. The adjustment of expectations to past inflation also differs across inflation regimes. The regime analysis is based on panel data from EMU/EU countries for the period 1973–2004, while the inflation adjustment analysis mainly uses the Kalman filter technique for individual countries over the same period. Expectations (forecasts) are derived from OECD data. Empirical results strongly favour the regime-sensitivity hypothesis and provide an explanation for the poor performance of conventional estimation procedures in the context of Phillips curves.
    Keywords: inflation expectations; Kalman filter; stability
    JEL: E32 E37
    Date: 2005–10–11
    URL: http://d.repec.org/n?u=RePEc:hhs:bofrdp:2005_025&r=for
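    A minimal sketch of one common way a Kalman filter is used for this kind of question; it is not the paper's specification. The assumed setup tracks a time-varying coefficient beta_t in expectation_t = beta_t * past_inflation_t + noise, with beta_t following a random walk, so the degree to which expectations reflect past inflation can drift across regimes.

      import numpy as np

      def filter_time_varying_beta(expectations, past_inflation, q=0.01, r=0.1):
          """Scalar Kalman filter for a random-walk coefficient.
          q: state (random-walk) variance, r: measurement variance; both assumed."""
          n = len(expectations)
          beta = np.empty(n)
          b, p = 1.0, 1.0                      # initial state mean and variance
          for t in range(n):
              p = p + q                        # prediction step
              x = past_inflation[t]
              k = p * x / (x * x * p + r)      # Kalman gain
              b = b + k * (expectations[t] - b * x)   # update with forecast error
              p = (1.0 - k * x) * p
              beta[t] = b
          return beta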
  5. By: Felix Hoeffler (Max Planck Institute for Research on Collective Goods, Bonn); Madjid Kuebler (Team Consult, Berlin, Germany)
    Abstract: The seasonal demand for natural gas requires supply flexibility. This “swing” is now largely provided in northwestern Europe by indigenous production. Declining reserves will increase the dependency on imports from far-off sources, which are less flexible. Hence, flexibility must be provided by additional storage. We estimate that in 2030 (depending on the desired level of security of supply) between 11 and 37 billion cubic meters of working gas volume will be required, in addition to the existing 37 billion cubic meters. This estimate is based on production and consumption forecasts for natural gas and on the observed relationship between the supply and demand of gas and the supply and demand of flexibility in the period 1995–2004. Scenarios and Monte Carlo simulations are provided to check the robustness of our results. We also briefly discuss policy implications for the regulation of third-party access to storage facilities.
    Keywords: Natural gas, storage, swing, third-party access
    JEL: L98 L51 Q41
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:mpg:wpaper:2006_9&r=for
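    A minimal sketch of the Monte Carlo idea in the abstract; all distributions and numbers below (apart from the 37 bcm of existing storage quoted in the abstract) are hypothetical placeholders, not the authors' data or results.

      import numpy as np

      rng = np.random.default_rng(0)
      n_draws = 10_000
      annual_demand_2030 = rng.normal(600.0, 60.0, n_draws)   # bcm/year, assumed
      swing_share = rng.uniform(0.15, 0.25, n_draws)          # seasonal asymmetry, assumed
      flexible_supply = rng.normal(80.0, 15.0, n_draws)       # swing from production/imports, assumed
      existing_storage = 37.0                                  # bcm, from the abstract

      # Additional working gas needed when swing demand exceeds flexible supply
      # plus existing storage; security-of-supply levels read off as percentiles.
      needed = np.maximum(annual_demand_2030 * swing_share
                          - flexible_supply - existing_storage, 0.0)
      for p in (50, 90, 99):
          print(f"P{p}: additional working gas needed = {np.percentile(needed, p):.0f} bcm")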
  6. By: Robin Hogarth
    Abstract: It is well accepted that people resist evidence that contradicts their beliefs. Moreover, despite their training, many scientists reject results that are inconsistent with their theories. This phenomenon is discussed in relation to the field of judgment and decision making by describing four case studies. These concern findings that “clinical” judgment is less predictive than actuarial models; that simple methods have proven superior to more “theoretically correct” methods in time series forecasting; that equal weighting of variables is often more accurate than using differential weights; and that decisions can sometimes be improved by discarding relevant information. All findings relate to the apparently difficult-to-accept idea that simple models can predict complex phenomena better than complex models can. It is true that there is a scientific marketplace for ideas. However, like its economic counterpart, it is subject to inefficiencies (e.g., thinness, asymmetric information, and speculative bubbles). Unfortunately, the market is only “correct” in the long run. The road to enlightenment is bumpy.
    Keywords: Decision making, judgment, forecasting, linear models, heuristics
    JEL: D81 M10
    Date: 2006–05
    URL: http://d.repec.org/n?u=RePEc:upf:upfgen:973&r=for
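    A minimal simulated sketch of the equal-weighting finding mentioned in the abstract: with a small, noisy sample and several correlated cues, simply summing standardized predictors often forecasts out of sample about as well as estimated regression weights. The data-generating process below is invented for illustration.

      import numpy as np

      rng = np.random.default_rng(1)
      n_train, n_test, k = 30, 1000, 5

      def draw(n):
          X = rng.normal(size=(n, k))
          y = X @ np.array([0.5, 0.4, 0.3, 0.2, 0.1]) + rng.normal(scale=2.0, size=n)
          return X, y

      X_tr, y_tr = draw(n_train)
      X_te, y_te = draw(n_test)

      # Differential weights: ordinary least squares on the small training sample.
      beta = np.linalg.lstsq(np.column_stack([np.ones(n_train), X_tr]), y_tr, rcond=None)[0]
      pred_ols = np.column_stack([np.ones(n_test), X_te]) @ beta

      # Equal weights: unweighted sum of standardized predictors.
      z = (X_te - X_tr.mean(0)) / X_tr.std(0)
      pred_equal = z.sum(axis=1)

      corr = lambda a, b: np.corrcoef(a, b)[0, 1]
      print("out-of-sample correlation, OLS weights  :", round(corr(pred_ols, y_te), 3))
      print("out-of-sample correlation, equal weights:", round(corr(pred_equal, y_te), 3))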

This nep-for issue is ©2006 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.