nep-ecm New Economics Papers
on Econometrics
Issue of 2009‒06‒03
25 papers chosen by
Sune Karlsson
Orebro University

  1. Bootstrap Tests of Stationarity By James Morley; Tara M. Sinclair
  2. Efficiency in Large Dynamic Panel Models with Common Factor By Patrick GAGLIARDINI; Christian GOURIEROUX
  3. Multivariate methods for monitoring structural change By Groen, Jan J J; Kapetanios, George; Price, Simon
  4. A Meta-Distribution for Non-Stationary Samples By Dominique Guégan
  5. Comment to "Weak Instruments Robust tests in GMM and the New Keynesian Phillips curve" by Frank Kleibergen and Sophocles Mavroeidis By Fabio Canova
  6. To Combine Forecasts or to Combine Information? By Huiyu Huang; Tae-Hwy Lee
  7. How Linear Models Can Mask Non-Linear Causal Relationships. An Application to Family Size and Children's Education By Magne Mogstad and Matthew Wiswall
  8. Bayesian Analysis of Time-Varying Parameter Vector Autoregressive Model for the Japanese Economy and Monetary Policy By Jouchi Nakajima; Munehisa Kasuya; Toshiaki Watanabe
  9. Forecasting the Spanish economy with an Augmented VAR-DSGE model By Gonzalo Fernández-de-Córdoba; José L. Torres
  10. Basket Options on Heterogeneous Underlying Assets By Georges Dionne; Geneviève Gauthier; Nadia Ouertani
  11. Co-integration Rank Testing under Conditional Heteroskedasticity By Giuseppe Cavaliere; Anders Rahbek; A.M.Robert Taylor
  12. Stochastic volatility of volatility in continuous time By Ole E. Barndorff-Nielsen; Almut E. D. Veraart
  13. Real-time conditional forecasts with Bayesian VARs: An application to New Zealand By Chris Bloor; Troy Matheson
  14. Multipower Variation for Brownian Semistationary Processes By Ole E. Barndorff-Nielsen; José Manuel Corcuera; Mark Podolskij
  15. THE FISHER INFORMATION MATRIX IN DOUBLY CENSORED DATA FROM THE DAGUM DISTRIBUTION By Filippo Domma; Sabrina Giordano; Mariangela Zenga
  16. Are 'unbiased' forecasts really unbiased? Another look at the Fed forecasts By Tara M. Sinclair; Fred Joutz; Herman O. Stekler
  17. Nonlinear Time Series in Financial Forecasting By Gloria González-Rivera; Tae-Hwy Lee
  18. Using wavelets to measure core inflation: the case in New Zealand By David Baqaee
  19. A Consistent Model of ‘Explosive’ Financial Bubbles With Mean-Reversing Residuals By Li LIN; Ruo En REN; Didier SORNETTE
  20. Volatility Models: from GARCH to Multi-Horizon Cascades By Alexander Subbotin; Thierry Chauveau; Kateryna Shapovalova
  21. Google Econometrics and Unemployment Forecasting By Askitas, Nikos; Zimmermann, Klaus F.
  22. Evaluating Current Year Forecasts Made During the Year: A Japanese Example By H.O. Stekler; Kazuta Sakamoto
  23. Forecasting Expected Shortfall with a Generalized Asymmetric Student-t Distribution By Dongming Zhu; John Galbraith
  24. Multivariate Forecast Errors and the Taylor Rule By Edward N. Gamber; Tara M. Sinclair; H.O. Stekler; Elizabeth Reid
  25. The Effects of Monetary Policy on Unemployment Dynamics Under Model Uncertainty. Evidence from the US and the Euro Area By Carlo Altavilla; Matteo Ciccarelli

  1. By: James Morley (Department of Economics Washington University in St. Louis); Tara M. Sinclair (Department of Economics The George Washington University)
    Abstract: We compare the finite-sample performance of different stationarity tests. Monte Carlo analysis reveals that tests based on Lagrange multiplier (LM) statistics with nonstandard asymptotic distributions reject far more often than their nominal size for trend-stationary processes of the kind estimated for macroeconomic data. Bootstrap versions of these LM tests have empirical rejection probabilities that are closer to nominal size, but they still tend to over-reject. Meanwhile, we find that a bootstrap likelihood ratio (LR) test has very accurate finite-sample size, while at the same time having higher power than the bootstrap LM tests against empirically relevant nonstationary alternatives. Based on the bootstrap LR test, and in some cases contrary to the bootstrap LM tests, we can reject trend stationarity for US real GDP, the unemployment rate, consumer prices, and payroll employment in favour of unit root processes with large permanent movements.
    Keywords: Stationarity Test, Unobserved Components, Parametric Bootstrap, Monte Carlo Simulation, Finite Sample Inference
    JEL: C12 C15 C22
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:gwc:wpaper:2008-11&r=ecm
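A minimal sketch of the parametric bootstrap logic described in item 1, assuming a simple linear-trend-plus-AR(1) null and a user-supplied test statistic; the paper's unobserved-components LR statistic is not reproduced here:

```python
import numpy as np

def fit_trend_ar1(y):
    """Fit y_t = a + b*t + u_t with AR(1) errors by a two-step OLS (illustrative)."""
    t = np.arange(len(y))
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta
    phi = np.dot(u[1:], u[:-1]) / np.dot(u[:-1], u[:-1])
    sigma = np.std(u[1:] - phi * u[:-1], ddof=1)
    return beta, phi, sigma

def simulate_null(beta, phi, sigma, n, rng):
    """Simulate a trend-stationary series under the fitted null (u_0 set to zero)."""
    t = np.arange(n)
    u = np.zeros(n)
    eps = rng.normal(0.0, sigma, n)
    for i in range(1, n):
        u[i] = phi * u[i - 1] + eps[i]
    return beta[0] + beta[1] * t + u

def bootstrap_pvalue(y, stat_fn, B=999, seed=0):
    """Parametric bootstrap p-value for a stationarity test statistic stat_fn."""
    rng = np.random.default_rng(seed)
    beta, phi, sigma = fit_trend_ar1(y)
    obs = stat_fn(y)
    boot = np.array([stat_fn(simulate_null(beta, phi, sigma, len(y), rng))
                     for _ in range(B)])
    return (1 + np.sum(boot >= obs)) / (B + 1)
```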
  2. By: Patrick GAGLIARDINI (University of Lugano and Swiss Finance Institute); Christian GOURIEROUX (CREST, CEPREMAP (Paris) and University of Toronto)
    Abstract: This paper deals with efficient estimation in exchangeable nonlinear dynamic panel models with a common unobservable factor. The specification accounts for both micro- and macro-dynamics, induced by the lagged individual observation and the common stochastic factor, respectively. For large cross-sectional and time dimensions, and under a semiparametric identification condition, we derive the efficiency bound and introduce efficient estimators for both the micro- and macro-parameters. In particular, we show that the fixed effects estimator of the micro-parameter is not only consistent, but also asymptotically efficient. The results are illustrated with the stochastic migration model for credit risk analysis.
    Keywords: Nonlinear Panel Model, Factor Model, Exchangeability, Systematic Risk, Efficiency Bound, Semi-parametric Efficiency, Fixed Effects Estimator, Bayesian Statistics, Stochastic Migration, Granularity
    JEL: C23 C13 G12
    Date: 2008–08
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp0912&r=ecm
  3. By: Groen, Jan J J (Federal Reserve Bank of New York); Kapetanios, George (Queen Mary and Westfield College); Price, Simon (Bank of England)
    Abstract: Detection of structural change is a critical empirical activity, but continuous 'monitoring' of time series for structural changes in real time raises well-known econometric issues. These have been explored in a univariate context. If multiple series co-break, as may be plausible, then it is possible that simultaneous examination of a multivariate set of data would help identify changes with higher probability or more rapidly than when series are examined on a case-by-case basis. Some asymptotic theory is developed for a maximum CUSUM detection test. Monte Carlo experiments suggest that there is an improvement in detection relative to a univariate detector over a wide range of experimental parameters, given a sufficiently large number of co-breaking series. The method is applied to UK RPI inflation in the period after 2001. A break is detected which would not have been picked up by univariate methods.
    Keywords: monitoring; structural change; panel; CUSUM; fluctuation test
    JEL: C10 C59
    Date: 2009–06–08
    URL: http://d.repec.org/n?u=RePEc:boe:boeewp:0369&r=ecm
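A rough illustration of a maximum-CUSUM detector across several co-breaking series, as in item 3; the standardization against a training window and the placeholder threshold are illustrative assumptions, not the paper's asymptotic critical values:

```python
import numpy as np

def max_cusum_monitor(panel, n_train, threshold):
    """panel: (T, N) array of series monitored jointly after a training window.

    Returns the first monitoring period at which the maximum (over series) of the
    absolute standardized CUSUM exceeds `threshold`, or None if no alarm is raised.
    """
    train, monitor = panel[:n_train], panel[n_train:]
    mu = train.mean(axis=0)
    sigma = train.std(axis=0, ddof=1)
    cusum = np.zeros(panel.shape[1])
    for t, row in enumerate(monitor, start=1):
        cusum += (row - mu) / sigma                 # accumulate standardized deviations
        stat = np.max(np.abs(cusum)) / np.sqrt(t)   # scale by sqrt of monitoring time
        if stat > threshold:
            return n_train + t                      # index of first detected break
    return None
```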
  4. By: Dominique Guégan (PSE, Centre d’Economie de la Sorbonne, University Paris1 Panthéon-Sorbonne)
    Abstract: In this paper, we focus on the construction of an invariant distribution function associated with a non-stationary sample. After discussing some specific problems caused by non-stationarity within samples, such as the "spurious" long-memory effect, we build a sequence of stationary processes that permits us to define the concept of a meta-distribution for a given non-stationary sample. We use this new approach to discuss some interesting econometric issues in a non-stationary setting, namely forecasting and risk management strategy.
    Keywords: Non-Stationarity, Copula, Long-memory, Switching, Cumulants, Estimation theory
    JEL: C32 C51 G12
    Date: 2009–06–01
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-24&r=ecm
  5. By: Fabio Canova
    Abstract: I discuss the identifiability of a structural New Keynesian Phillips curve when it is embedded in a small scale dynamic stochastic general equilibrium model. Identification problems emerge because not all the structural parameters are recoverable from the semi-structural ones and because the objective functions I consider are poorly behaved. The solution and the moment mappings are responsible for the problems.
    Keywords: Identification, DSGE models, New Keynesian Phillips curve, Identification robust estimation methods
    JEL: C10 C52 E32 E50
    Date: 2009–01
    URL: http://d.repec.org/n?u=RePEc:upf:upfgen:1159&r=ecm
  6. By: Huiyu Huang (PanAgora Asset Management); Tae-Hwy Lee (Department of Economics, University of California Riverside)
    Abstract: When the objective is to forecast a variable of interest but with many explanatory variables available, one could possibly improve the forecast by carefully integrating them. There are generally two directions one could proceed: combination of forecasts (CF) or combination of information (CI). CF combines forecasts generated from simple models each incorporating a part of the whole information set, while CI brings the entire information set into one super model to generate an ultimate forecast. Through linear regression analysis and simulation, we show the relative merits of each approach, in particular the circumstances under which CF forecasts can be superior to CI forecasts, both when the CI model is correctly specified and when it is misspecified, and we shed some light on the success of equally weighted CF. In our empirical application to predicting the monthly, quarterly, and annual equity premium, we compare the CF forecasts (with various weighting schemes) to CI forecasts (with a principal component approach mitigating the problem of parameter proliferation). We find that CF with (close to) equal weights is generally the best and dominates all CI schemes, while also performing substantially better than the historical mean.
    Keywords: Equally weighted combination of forecasts, Equity premium, Factor models, Forecast combination, Forecast combination puzzle, Information sets, Many predictors, Principal components, Shrinkage
    JEL: C3 C5 G0
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:ucr:wpaper:200806&r=ecm
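A small sketch contrasting the two routes in item 6: equal-weight combination of single-predictor forecasts (CF) versus a principal-component "super model" (CI). The two-component choice and the plain OLS forecasts are illustrative assumptions:

```python
import numpy as np

def ols_forecast(y, X, x_new):
    """OLS fit of y on X (with intercept) and forecast at x_new."""
    Z = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return np.concatenate([[1.0], np.atleast_1d(x_new)]) @ beta

def cf_equal_weights(y, X, x_new):
    """Combination of forecasts: average of single-predictor OLS forecasts."""
    preds = [ols_forecast(y, X[:, [j]], x_new[j]) for j in range(X.shape[1])]
    return np.mean(preds)

def ci_principal_components(y, X, x_new, k=2):
    """Combination of information: regress y on the first k principal components."""
    mu, sd = X.mean(axis=0), X.std(axis=0, ddof=1)
    Xs = (X - mu) / sd
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    F = Xs @ Vt[:k].T                       # estimated factors (T x k)
    f_new = ((x_new - mu) / sd) @ Vt[:k].T  # factors at the new observation
    return ols_forecast(y, F, f_new)
```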
  7. By: Magne Mogstad and Matthew Wiswall (Statistics Norway)
    Abstract: Many empirical studies specify outcomes as a linear function of endogenous regressors when conducting instrumental variable (IV) estimation. We show that commonly used tests for treatment effects, selection bias, and treatment effect heterogeneity are biased if the true relationship is non-linear. In particular, using linear models can only lead to under-rejection of the null hypothesis of no treatment effects. In light of these results, we re-examine the recent evidence suggesting that family size has no causal effect on children's education. Following common practice, a linear IV estimator has been used, assuming constant marginal effects of additional children across family sizes. We show that the conclusion of no causal effect of family size is an artifact of the specification of a linear model, which masks significant marginal family size effects. Estimating a model that is non-parametric in family size, we find that family size matters substantially for children's educational attainment, but in a non-monotonic way. Our findings illustrate that IV estimation of models which relax linearity restrictions is an important addition to empirical research, particularly when OLS estimation and theory suggest the possibility of non-linear causal effects.
    Keywords: Instrumental variables; variable treatment intensity; treatment effect heterogeneity; selection bias; quantity-quality; family size; child outcome
    JEL: C31 C14 J13
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:ssb:dispap:586&r=ecm
  8. By: Jouchi Nakajima (Institute for Monetary and Economic Studies, Bank of Japan (E-mail: jouchi.nakajima-1@boj.or.jp)); Munehisa Kasuya (Research and Statistics Department, Bank of Japan (E-mail: munehisa.kasuya@boj.or.jp)); Toshiaki Watanabe (Professor, Institute of Economic Research, Hitotsubashi University, and Institute for Monetary and Economic Studies, Bank of Japan (E-mail: watanabe@ier.hit-u.ac.jp))
    Abstract: This paper analyzes the time-varying parameter vector autoregressive (TVP-VAR) model for the Japanese economy and monetary policy. The time-varying parameters are estimated via the Markov chain Monte Carlo method and the posterior estimates of parameters reveal the time-varying structure of the Japanese economy and monetary policy during the period from 1981 to 2008. The marginal likelihoods of the TVP-VAR model and other VAR models are also estimated. The estimated marginal likelihoods indicate that the TVP-VAR model best fits the Japanese economic data.
    Keywords: Bayesian inference, Markov chain Monte Carlo, Monetary policy, State space model, Structural vector autoregressive model, Stochastic volatility, Time-varying parameter
    JEL: C11 C15 E52
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:ime:imedps:09-e-13&r=ecm
  9. By: Gonzalo Fernández-de-Córdoba (Universidad de Salamanca); José L. Torres (Universidad de Málaga)
    Abstract: During the past ten years Dynamic Stochastic General Equilibrium (DSGE) models have become an important tool in quantitative macroeconomics. However, DSGE models were not considered forecasting tools until very recently. The objective of this paper is twofold. First, we compare the forecasting ability of a canonical DSGE model for the Spanish economy with that of other standard econometric techniques. More precisely, we compare out-of-sample forecasts coming from different estimation methods of the DSGE model to the forecasts produced by a VAR and a Bayesian VAR. Second, we propose a new method for combining DSGE and VAR models (Augmented VAR-DSGE) through the expansion of the variable space where the VAR operates with artificial series obtained from a DSGE model. The results indicate that the out-of-sample forecasting performance of the proposed method outperforms all the considered alternatives.
    Keywords: DSGE models, forecasting, VAR, BVAR
    JEL: C53 E32 E37
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:mal:wpaper:2009-1&r=ecm
  10. By: Georges Dionne; Geneviève Gauthier; Nadia Ouertani
    Abstract: Basket options are among the most popular products of the new generation of exotic options. This attraction is explained by the fact that they can efficiently and simultaneously hedge a wide variety of intrinsically different financial risks. They are flexible enough to include all the risks faced by non-financial firms. Unfortunately, the existing literature on basket options considers only homogeneous baskets where all the underlying assets are identical and hedge the same kind of risk. Moreover, the empirical implementation of basket-option models is not yet well developed, particularly when they are composed of heterogeneous underlying assets. This paper focuses on the modelling and parameter estimation of basket options on a commodity price with stochastic convenience yield, an exchange rate, and domestic and foreign zero-coupon bonds in a stochastic interest rate setting. We empirically compare the performance of the heterogeneous basket option to that of a portfolio of individual options. The results show that the basket strategy is less expensive and more efficient. We apply the maximum-likelihood method to estimate the different parameters of the theoretical basket model as well as the correlations between the variables. Monte Carlo studies are conducted to examine the performance of the maximum-likelihood estimator in finite samples of simulated data. A real data study is presented.
    Keywords: Basket options, maximum likelihood, hedging performance, options pricing, Monte Carlo simulation
    JEL: C15 C16 G10 G13
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:0918&r=ecm
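The basket in item 10 couples a commodity price with stochastic convenience yield, an exchange rate and zero-coupon bonds under stochastic interest rates; as a far simpler stand-in, the sketch below Monte Carlo-prices a basket call on correlated geometric Brownian motions, conveying the pricing logic but none of the paper's richer dynamics:

```python
import numpy as np

def mc_basket_call(spots, weights, vols, corr, r, T, strike, n_paths=100_000, seed=0):
    """Monte Carlo price of a European call on a weighted basket of GBM assets."""
    rng = np.random.default_rng(seed)
    spots, weights, vols = map(np.asarray, (spots, weights, vols))
    L = np.linalg.cholesky(np.asarray(corr))            # correlate the Brownian shocks
    z = rng.standard_normal((n_paths, len(spots))) @ L.T
    drift = (r - 0.5 * vols**2) * T
    terminal = spots * np.exp(drift + vols * np.sqrt(T) * z)   # terminal asset prices
    basket = terminal @ weights
    payoff = np.maximum(basket - strike, 0.0)
    return np.exp(-r * T) * payoff.mean()

# Illustrative call: three assets, equal weights, mild correlation.
price = mc_basket_call(spots=[100, 50, 80], weights=[1/3, 1/3, 1/3],
                       vols=[0.2, 0.3, 0.25],
                       corr=[[1, 0.3, 0.2], [0.3, 1, 0.4], [0.2, 0.4, 1]],
                       r=0.03, T=1.0, strike=75.0)
```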
  11. By: Giuseppe Cavaliere (Department of Statistical Sciences, University of Bologna); Anders Rahbek (Department of Economics, University of Copenhagen and CREATES); A.M.Robert Taylor (School of Economics and Granger Centre for Time Series Econometrics, University of Nottingham)
    Abstract: We analyse the properties of the conventional Gaussian-based co-integrating rank tests of Johansen (1996) in the case where the vector of series under test is driven by globally stationary, conditionally heteroskedastic (martingale difference) innovations. We first demonstrate that the limiting null distributions of the rank statistics coincide with those derived by previous authors who assume either i.i.d. or (strict and covariance) stationary martingale difference innovations. We then propose wild bootstrap implementations of the co-integrating rank tests and demonstrate that the associated bootstrap rank statistics replicate the first-order asymptotic null distributions of the rank statistics. We show the same is also true of the corresponding rank tests based on the i.i.d. bootstrap of Swensen (2006). The wild bootstrap, however, has the important property that, unlike the i.i.d. bootstrap, it preserves in the re-sampled data the pattern of heteroskedasticity present in the original shocks. Consistent with this, numerical evidence suggests that, relative to tests based on the asymptotic critical values or the i.i.d. bootstrap, the wild bootstrap rank tests perform very well in small samples under a variety of conditionally heteroskedastic innovation processes. An empirical application to the term structure of interest rates is given.
    Keywords: Co-integration, trace and maximum eigenvalue rank tests, conditional heteroskedasticity, i.i.d. bootstrap; wild bootstrap
    JEL: C30 C32
    Date: 2009–05–28
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-22&r=ecm
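The key step distinguishing the wild bootstrap of item 11 from an i.i.d. bootstrap is how residuals are re-sampled; a minimal sketch, assuming the VECM residuals have already been computed:

```python
import numpy as np

def wild_bootstrap_residuals(residuals, rng):
    """Wild bootstrap: multiply each period's residual vector by a Rademacher draw.

    residuals: (T, p) array. Unlike an i.i.d. bootstrap (which re-samples rows with
    replacement), the time ordering and magnitude of each period's residual are kept,
    so any pattern of conditional heteroskedasticity carries over to the bootstrap sample.
    """
    T = residuals.shape[0]
    w = rng.choice([-1.0, 1.0], size=T)      # Rademacher weights, one per period
    return residuals * w[:, None]

def iid_bootstrap_residuals(residuals, rng):
    """i.i.d. bootstrap: re-sample whole residual vectors with replacement."""
    T = residuals.shape[0]
    idx = rng.integers(0, T, size=T)
    return residuals[idx]
```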
  12. By: Ole E. Barndorff-Nielsen (The T.N. Thiele Centre for Mathematics in Natural Science, Department of Mathematical Sciences, & CREATES, Aarhus University); Almut E. D. Veraart (School of Economics and Management, Aarhus University and CREATES)
    Abstract: This paper introduces the concept of stochastic volatility of volatility in continuous time and, hence, extends standard stochastic volatility (SV) models to allow for an additional source of randomness associated with greater variability in the data. We discuss how stochastic volatility of volatility can be defined both non-parametrically, where we link it to the quadratic variation of the stochastic variance process, and parametrically, where we propose two new SV models which allow for stochastic volatility of volatility. In addition, we show that volatility of volatility can be estimated by a novel estimator called pre-estimated spot variance based realised variance.
    Keywords: Stochastic volatility, volatility of volatility, non-Gaussian Ornstein–Uhlenbeck process, superposition, leverage effect, Lévy processes.
    JEL: C10 C13 C14 G10
    Date: 2009–07–06
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-25&r=ecm
  13. By: Chris Bloor; Troy Matheson (Reserve Bank of New Zealand)
    Abstract: We develop a large Bayesian VAR (BVAR) model of the New Zealand economy that incorporates the conditional forecasting estimation techniques of Waggoner and Zha (1999). We examine the real-time forecasting performance as the size of the model increases using an unbalanced data panel. In a real-time out-of-sample forecasting exercise, we find that our BVAR methodology outperforms univariate and VAR benchmarks, and produces comparable forecast accuracy to the judgementally-adjusted forecasts produced internally at the Reserve Bank of New Zealand. We analyse forecast performance and find that, while there are trade-offs across different variables, a 35-variable BVAR generally performs better than 8-, 13-, or 50-variable specifications for our dataset. Finally, we demonstrate techniques for imposing judgement and for forming a semi-structural interpretation of the BVAR forecasts.
    JEL: C11 C13 C53
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:nzb:nzbdps:2009/02&r=ecm
  14. By: Ole E. Barndorff-Nielsen (Aarhus University and CREATES); José Manuel Corcuera (Universitat de Barcelona); Mark Podolskij (ETH Zürich and CREATES)
    Abstract: In this paper we study the asymptotic behaviour of power and multipower variations of stochastic processes. Processes of the type considered serve, in particular, to analyse data on velocity increments of a fluid in a turbulence regime with spot intermittency sigma. The purpose of the present paper is to determine the probabilistic limit behaviour of the (multi)power variations of such processes Y, as a basis for studying properties of the intermittency process. Notably, the processes Y are in general not of the semimartingale kind and the established theory of multipower variation for semimartingales does not suffice for deriving the limit properties. As a key tool for the results, a general central limit theorem for triangular Gaussian schemes is formulated and proved. Examples and an application to realised variance ratio are given.
    Keywords: Central Limit Theorem; Gaussian Processes; Intermittency; Nonsemimartingales; Turbulence; Volatility; Wiener Chaos
    JEL: C10 C80
    Date: 2009–05–26
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-21&r=ecm
  15. By: Filippo Domma; Sabrina Giordano (Dipartimento di Economia e Statistica, Università della Calabria); Mariangela Zenga (Dipartimento di Metodi Quantitativi, Università degli Studi di Milano-Bicocca)
    Abstract: In this note, we provide the mathematical tools for computing the entries of the Fisher information matrix in the case where the observations from a Dagum distribution are doubly censored.
    Keywords: Order Statistics, Maximum Likelihood Estimator, Fisher Information Matrix
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:clb:wpaper:200908&r=ecm
  16. By: Tara M. Sinclair (Department of Economics The George Washington University); Fred Joutz (Department of Economics The George Washington University); Herman O. Stekler (Department of Economics The George Washington University)
    Abstract: This paper reconciles contradictory findings obtained from forecast evaluations: the existence of systematic errors and the failure to reject rationality in the presence of such errors. Systematic errors in one economic state may offset the opposite types of errors in the other state such that the null of rationality is not rejected. A modified test applied to the Fed forecasts shows that the forecasts were ex post biased.
    Keywords: Greenbook Forecasts, forecast evaluation, systematic errors
    JEL: C53 E37 E52 E58
    Date: 2008–08
    URL: http://d.repec.org/n?u=RePEc:gwc:wpaper:2008-010&r=ecm
  17. By: Gloria González-Rivera (Department of Economics, University of California Riverside); Tae-Hwy Lee (Department of Economics, University of California Riverside)
    Date: 2007–09
    URL: http://d.repec.org/n?u=RePEc:ucr:wpaper:200803&r=ecm
  18. By: David Baqaee (Reserve Bank of New Zealand)
    Abstract: This paper uses wavelets to develop a core inflation measure for inflation targeting central banks. The analysis is applied to the case of New Zealand – the country with the longest history of explicit inflation targeting. We compare the performance of our proposed measure against some popular alternatives. Our measure does well at identifying a reliable medium-term trend in inflation. It also has comparable forecasting performance to standard benchmarks.
    JEL: C32 E31 E52 E58
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:nzb:nzbdps:2009/05&r=ecm
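A compact sketch of wavelet-based trend extraction in the spirit of item 18, using PyWavelets; the wavelet family, the decomposition depth and the decision to discard the finest detail levels are illustrative assumptions rather than the paper's calibration:

```python
import numpy as np
import pywt

def wavelet_core_inflation(inflation, wavelet="db4", level=4, keep_levels=2):
    """Extract a smooth 'core' component by discarding the finest detail coefficients.

    inflation   : 1-D array of headline inflation observations
    keep_levels : number of coarsest detail levels retained alongside the approximation
    """
    coeffs = pywt.wavedec(inflation, wavelet, level=level)
    # coeffs = [approximation, detail_level, ..., detail_1]; zero out the finest details
    for i in range(1 + keep_levels, len(coeffs)):
        coeffs[i] = np.zeros_like(coeffs[i])
    core = pywt.waverec(coeffs, wavelet)
    return core[: len(inflation)]            # waverec may pad by one observation
```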
  19. By: Li LIN (ETH Zurich and Beihang University); Ruo En REN (Beihang University); Didier SORNETTE (ETH Zurich and Swiss Finance Institute)
    Abstract: We present a self-consistent model for explosive financial bubbles, which combines a mean-reverting volatility process and a stochastic conditional return which reflects nonlinear positive feedbacks and continuous updates of the investors’ beliefs and sentiments. The conditional expected returns exhibit faster-than-exponential acceleration decorated by accelerating oscillations, called “log-periodic power law.” Tests on residuals show a remarkably low rate (0.2%) of false positives when applied to a GARCH benchmark. When tested on the S&P500 US index from Jan. 3, 1950 to Nov. 21, 2008, the model correctly identifies the bubbles ending in Oct. 1987, in Oct. 1997, in Aug. 1998 and the ITC bubble ending in the first quarter of 2000. Different unit-root tests confirm the high relevance of the model specification. Our model also provides a diagnostic for the duration of bubbles: applied to the period before the Oct. 1987 crash, there is clear evidence that the bubble started at least 4 years earlier. We confirm the validity and universality of the volatility-confined LPPL model on seven other major bubbles that have occurred around the world in the last two decades. Using Bayesian inference, we find a very strong statistical preference for our model compared with a standard benchmark, in contradiction with Chang and Feigenbaum [2006], who used a unit-root model for residuals.
    Keywords: Rational bubbles; mean reversal; positive feedbacks; finite-time singularity; superexponential growth; Bayesian analysis; log-periodic power law
    JEL: C11
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp0914&r=ecm
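For reference, a sketch of the log-periodic power law (LPPL) expected log-price referred to in item 19; parameter names follow the standard notation and the numerical values are purely illustrative:

```python
import numpy as np

def lppl_log_price(t, tc, A, B, C, m, omega, phi):
    """Log-periodic power law: expected log-price before a critical time tc.

    ln p(t) = A + B*(tc - t)**m + C*(tc - t)**m * cos(omega*ln(tc - t) + phi)
    """
    dt = tc - np.asarray(t, dtype=float)     # time remaining until the critical date
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) + phi)

# Illustrative parameters (not fitted to any of the bubble episodes in the paper).
t = np.linspace(0.0, 9.5, 200)
lp = lppl_log_price(t, tc=10.0, A=7.0, B=-0.5, C=0.05, m=0.5, omega=8.0, phi=1.0)
```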
  20. By: Alexander Subbotin (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Thierry Chauveau (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Kateryna Shapovalova (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I)
    Abstract: We survey different methods of modeling the volatility of stock prices and exchange rates, focusing on their ability to reproduce the empirical properties of the corresponding time series. The properties of price fluctuations vary across the time scales of observation. The adequacy of different models for describing price dynamics at several time horizons simultaneously is the central topic of this study. We propose a detailed survey of recent volatility models accounting for multiple horizons. These models are based on different and sometimes competing theoretical concepts. They belong either to GARCH or stochastic volatility model families and often borrow methodological tools from statistical physics. We compare their properties and comment on their practical usefulness and prospects.
    Keywords: Volatility modeling, GARCH, stochastic volatility, volatility cascade, multiple horizons in volatility.
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00390636_v1&r=ecm
  21. By: Askitas, Nikos (IZA); Zimmermann, Klaus F. (IZA, DIW Berlin and Bonn University)
    Abstract: The current economic crisis requires fast information to predict economic behavior early, which is difficult at times of structural change. This paper suggests an innovative new method of using data on internet activity for that purpose. It demonstrates strong correlations between keyword searches and unemployment rates using monthly German data and shows that the method has strong potential.
    Keywords: time-series analysis, internet, Google, keyword search, search engine, unemployment, predictions
    JEL: C22 C82 E17 E24 E37
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp4201&r=ecm
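A toy version of the idea in item 21, assuming a monthly unemployment-rate series and a keyword search-intensity index are already aligned; it regresses the change in unemployment on a lagged change in the search index, whereas the paper's error-correction specifications are richer:

```python
import numpy as np

def search_index_forecast(unemployment, search_index, lag=1):
    """One-step forecast of unemployment using lagged changes in a search-intensity index.

    Regresses the monthly change in unemployment on the `lag`-months-earlier change
    in the keyword search index, then extrapolates one month ahead.
    """
    du = np.diff(unemployment)                  # changes in the unemployment rate
    ds = np.diff(search_index)                  # changes in the search index
    y, X = du[lag:], np.column_stack([np.ones(len(du) - lag), ds[:-lag]])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    next_change = beta[0] + beta[1] * ds[-lag]  # most recent available index change
    return unemployment[-1] + next_change
```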
  22. By: H.O. Stekler (Department of Economics, George Washington University); Kazuta Sakamoto (Department of Economics, George Washington University)
    Abstract: Forecasts for the current year that are made sometime during the current year are not true annual forecasts because they include already known information for the early part of the year. The current methodology that evaluates these “forecasts” does not take into account the known information. This paper presents a methodology for calculating an implicit forecast for the latter part of a year conditional on the known information. We then apply the procedure to Japanese forecasts for 1988-2003 and analyze some of the characteristics of those predictions.
    Keywords: Forecasting, Japanese forecasts, evaluation techniques
    JEL: E37
    Date: 2008–07
    URL: http://d.repec.org/n?u=RePEc:gwc:wpaper:2008-005&r=ecm
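The arithmetic behind the "implicit" forecast for the remainder of the year in item 22 can be sketched as follows, under the simplifying assumption that the annual forecast refers to year-average-over-year-average growth and that the first k quarters of the year are already observed:

```python
def implicit_remaining_quarters(annual_growth_fcst, prior_year_quarters, known_quarters):
    """Back out the average level the remaining quarters must reach for the annual
    forecast (year-average growth) to hold, given the quarters already observed."""
    prior_avg = sum(prior_year_quarters) / 4.0
    implied_year_avg = prior_avg * (1.0 + annual_growth_fcst)
    k = len(known_quarters)
    return (4.0 * implied_year_avg - sum(known_quarters)) / (4 - k)

# Example: a 2% annual forecast, prior-year quarterly levels, first two quarters known.
avg_needed = implicit_remaining_quarters(0.02, [100, 101, 102, 103], [104, 105])
```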
  23. By: Dongming Zhu; John Galbraith
    Abstract: Financial returns typically display heavy tails and some skewness, and conditional variance models with these features often outperform more limited models. The difference in performance may be especially important in estimating quantities that depend on tail features, including risk measures such as the expected shortfall. Here, using a recent generalization of the asymmetric Student-t distribution to allow separate parameters to control skewness and the thickness of each tail, we fit daily financial returns and forecast expected shortfall for the S&P 500 index and a number of individual company stocks; the generalized distribution is used for the standardized innovations in a nonlinear, asymmetric GARCH-type model. The results provide empirical evidence for the usefulness of the generalized distribution in improving prediction of downside market risk of financial assets.
    Keywords: asymmetric distribution, expected shortfall, NGARCH (nonlinear generalized autoregressive conditional heteroskedasticity) model
    JEL: C16 G10
    Date: 2009–05–01
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2009s-24&r=ecm
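A hedged sketch of the forecasting step in item 23: simulate standardized innovations, scale them by a one-step-ahead volatility forecast, and average the losses beyond the VaR quantile. A symmetric Student-t stands in for the paper's generalized asymmetric Student-t, and the volatility forecast is taken as given:

```python
import numpy as np

def expected_shortfall_forecast(sigma_next, alpha=0.01, df=5, n_sims=200_000, seed=0):
    """Monte Carlo one-step-ahead expected shortfall of a return r = sigma_next * z.

    z is a unit-variance Student-t draw, a simplified stand-in for the paper's
    generalized asymmetric Student-t innovations; sigma_next is the GARCH-type
    one-step-ahead volatility forecast, taken as given.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_t(df, size=n_sims)
    z /= np.sqrt(df / (df - 2.0))              # rescale to unit variance (df > 2)
    returns = sigma_next * z
    var = np.quantile(returns, alpha)          # the alpha-level value-at-risk quantile
    return returns[returns <= var].mean()      # mean return beyond the quantile

# Example: 1% expected shortfall given a next-day volatility forecast of 1.8%.
es = expected_shortfall_forecast(sigma_next=0.018, alpha=0.01)
```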
  24. By: Edward N. Gamber (Department of Economics and Business, Lafayette College); Tara M. Sinclair (Department of Economics, George Washington University); H.O. Stekler (Department of Economics, George Washington University); Elizabeth Reid (Department of Economics, George Washington University)
    Abstract: This paper presents a new methodology to evaluate the impact of forecast errors on policy. We apply this methodology to the Federal Reserve forecasts of U.S. real output growth and the inflation rate using the Taylor (1993) monetary policy rule. Our results suggest it is possible to calculate policy forecast errors using joint predictions for a number of variables. These policy forecast errors have a direct interpretation for the impact of forecasts on policy. In the case of the Federal Reserve, we find that, on average, Fed policy based on the Taylor rule was approximately a full percentage point away from the intended target because of errors in forecasting growth and inflation.
    Keywords: Forecast Evaluation, Federal Reserve Forecasts, Monetary Policy
    JEL: C53 E37 E52 E58
    Date: 2008–04
    URL: http://d.repec.org/n?u=RePEc:gwc:wpaper:2008-002&r=ecm
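The "policy forecast error" of item 24 can be illustrated directly from the Taylor (1993) rule: compute the rule-implied rate once with forecasts and once with realized values and take the difference. The coefficients and 2% targets are the standard Taylor values; the output-gap input is a simplification of the paper's growth forecasts:

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Taylor (1993) rule: i = r* + pi + 0.5*(pi - pi*) + 0.5*output_gap (in percent)."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

def policy_forecast_error(pi_fcst, gap_fcst, pi_actual, gap_actual):
    """Gap between the rate implied by forecasts and the rate implied by outcomes."""
    return taylor_rate(pi_fcst, gap_fcst) - taylor_rate(pi_actual, gap_actual)

# Example: forecasts of 2.5% inflation and a 1% gap versus outcomes of 3.5% and -0.5%.
err = policy_forecast_error(2.5, 1.0, 3.5, -0.5)   # -0.75 percentage points
```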
  25. By: Carlo Altavilla (University of Naples Parthenope and CSEF); Matteo Ciccarelli (European Central Bank)
    Abstract: This paper explores the role that imperfect knowledge of the structure of the economy plays in the uncertainty surrounding the effects of rule-based monetary policy on unemployment dynamics in the euro area and the US. We employ a Bayesian model averaging procedure over a wide range of models which differ in several dimensions to account for the uncertainty that the policymaker faces when setting monetary policy and evaluating its effect on the real economy. We find evidence of a high degree of dispersion across models in both policy rule parameters and impulse response functions. Moreover, monetary policy shocks have very similar recessionary effects on the two economies, with a different role played by the participation rate in the transmission mechanism. Finally, we show that a policymaker who does not take model uncertainty into account and selects the results on the basis of a single model may come to misleading conclusions not only about the transmission mechanism, but also about the differences between the euro area and the US, which are on average small.
    Keywords: Monetary policy, Model uncertainty, Bayesian model averaging, Unemployment gap, Taylor rule
    JEL: C11 E24 E52 E58
    Date: 2009–06–07
    URL: http://d.repec.org/n?u=RePEc:sef:csefwp:231&r=ecm

This nep-ecm issue is ©2009 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.